How to archive when using 2 mysql servers


#1

Hello.
We are running Piwik 1.5.1 and have a question about running the archive cronjob.
Our setup has separate web AND MySQL servers for the frontend AND the backend. Since the frontend and backend environments are separated, including the MySQL databases, we have two Piwik installations.

The MySQL data from the frontend environment is dumped (when newer) into the backend environment daily.
All this brings some problems.
When we work with settings, users, or sites, we have to make sure those changes are also made on the frontend system; otherwise we risk losing data when updating…
Optimizing tables in the backend is pointless, since that data will be overwritten with the frontend data…

It looks like only the following tables change in the live system and will be dumped:

piwik_log_action
piwik_log_visit
piwik_log_link_visit_action
piwik_session

Maybe it would be a good idea to explicitly dump just those tables (preferably without the session table, I suppose).
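As a sketch of that idea, the transfer of only the three log tables could look like the following. Host names, the database name, and credentials handling are placeholders for your setup, and the command is only printed here so it can be reviewed before being wired into cron:

```shell
#!/bin/sh
# Sketch: ship only the frontend log tables to the backend database.
# FRONTEND_HOST, BACKEND_HOST and DB are hypothetical placeholders.
FRONTEND_HOST="frontend-db.example"
BACKEND_HOST="backend-db.example"
DB="piwik"
# piwik_session is deliberately left out of the list:
LOG_TABLES="piwik_log_action piwik_log_visit piwik_log_link_visit_action"

# Print the pipeline instead of executing it, so it can be checked first:
echo "mysqldump -h $FRONTEND_HOST $DB $LOG_TABLES | mysql -h $BACKEND_HOST $DB"
```

Dumping named tables like this keeps the nightly transfer small compared to a full 4 GB dump, since only the listed tables are exported.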
If admin settings are made on the admin system, maybe those tables should be dumped the other way around.
Which tables are affected? I assume:

piwik_option
piwik_site*
piwik_user*
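The reverse direction for the settings tables could be sketched the same way. Note that mysqldump takes explicit table names, not shell-style patterns, so `piwik_site*` and `piwik_user*` have to be spelled out; the concrete names below are my assumption from a default 1.x schema and may need extending (again, the command is only printed for review):

```shell
#!/bin/sh
# Sketch: push the admin/settings tables backend -> frontend.
# Hosts and DB name are placeholders; the table list is an assumption
# and should be checked against SHOW TABLES on your install.
BACKEND_HOST="backend-db.example"
FRONTEND_HOST="frontend-db.example"
DB="piwik"
SETTINGS_TABLES="piwik_option piwik_site piwik_site_url piwik_user piwik_access"

echo "mysqldump -h $BACKEND_HOST $DB $SETTINGS_TABLES | mysql -h $FRONTEND_HOST $DB"
```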

We have around 12 sites, with a database size of around 4.3 GB.

How should we set up the archive cronjob?
We unfortunately have to admit that it has never been run until now…

My idea is to run it once daily, before the data is dumped to the admin system.
Once archive.sh has run successfully ONCE, is all the older data archived correctly, so that we can delete the older logs and still keep the report overviews?
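For the cron setup itself, a crontab entry along these lines would run the bundled archive script shortly before the nightly dump. The install path, the 01:00 schedule, the log file location, and the MAILTO address are all assumptions to adapt:

```
# Hypothetical crontab entry for the backend system: archive at 01:00 and
# keep the script's output in a log file for troubleshooting.
MAILTO="admin@example.com"
0 1 * * * /var/www/piwik/misc/cron/archive.sh > /var/log/piwik-archive.log 2>&1
```

Redirecting stdout and stderr into one log file means failed runs leave a trace instead of disappearing silently.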

Thanks in advance,
Christian


#2

Just out of curiosity, why did you decide to go with two DB servers? Why not master-to-master replication?


#3

Hello JMETRIX,
Sometimes I ask myself that. But the reasons were kind of complex: the server structure is different, with more instances running on different ports, and the backend system also has other jobs to do.

c


#4

I see. Well, your life would be easier if you did not do that. Are you using different nodes for the Piwik installation? So server A has Piwik and server B has the DB (just to put it simply)? If you kick off your archive.sh script manually, what happens? Did you set it to save to a log file so you can check the results of running it? Also ensure the servers can communicate with one another and are allowed to (iptables, firewall).
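A manual test run that keeps the output could, for example, look like this. The install path is an assumption, and the script is skipped with a message if it is not found there:

```shell
#!/bin/sh
# Sketch: run the archiver once by hand and capture everything it prints.
# ARCHIVE_SCRIPT is a hypothetical path -- point it at your Piwik install.
ARCHIVE_SCRIPT="/var/www/piwik/misc/cron/archive.sh"
LOG="/tmp/piwik-archive-test.log"

if [ -x "$ARCHIVE_SCRIPT" ]; then
    "$ARCHIVE_SCRIPT" > "$LOG" 2>&1
    echo "archive.sh finished with exit status $?; output in $LOG"
else
    echo "archive script not found at $ARCHIVE_SCRIPT"
fi
```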


#5

Hello again.
I’m not sure you got it right. It’s frontend web server A with frontend database B, and backend web server C with backend database D, basically.
Communication between the servers is not the problem. I just want to know, more or less, which tables are written or changed by the archive script.
Regards,
c


#6

Please, could you give me a hint as to which tables are written by the frontend and which by the backend?
To me it looks like the log_* tables are written by frontend calls, and the rest by the backend.
What about the piwik_logger* tables?

My idea is to archive daily in the backend and delete the old entries in the log tables on the frontend…
That way I would only have to dump the log tables into the backend.
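One possible ordering of those daily steps could be sketched as a dry run like this. Hosts, paths, the `visit_last_action_time` column name, and the 90-day retention window are all assumptions; each step is only echoed so the sequence can be reviewed before anything runs for real:

```shell
#!/bin/sh
# Dry-run sketch of the proposed daily sequence; nothing is executed,
# each step is just printed so the order can be reviewed.
run() { echo "WOULD RUN: $*"; }

# 1) Ship only the log tables frontend -> backend.
run "mysqldump -h frontend-db piwik piwik_log_action piwik_log_visit piwik_log_link_visit_action | mysql -h backend-db piwik"
# 2) Build the reports in the backend before anything is deleted.
run "/var/www/piwik/misc/cron/archive.sh"
# 3) Purge old frontend rows (column name assumed from the 1.x schema).
run "mysql -h frontend-db piwik -e \"DELETE FROM piwik_log_visit WHERE visit_last_action_time < NOW() - INTERVAL 90 DAY\""
```

Archiving before the purge matters: once the reports are built, the raw log rows are no longer needed for the overviews.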

Any comments?
Regards,
Christian