At the moment, two of the thirty sites we are tracking with Piwik are causing "allowed memory size" errors whenever updates are run, whether via the browser, via cron, or via a shell command.
Any site that comes after one of these two in the run no longer gets updated, because the script bombs out as soon as it hits the allowed-memory-size error.
Apart from moving the whole Piwik install to a different server, we are now stuck.
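As a stopgap (not a fix for the underlying issue), you can try raising PHP's memory limit for the archive run only, using the CLI `-d` override; the path and the 1024M value below are examples, not recommendations:

```shell
# Stopgap sketch: raise the CLI memory limit just for this invocation.
# /path/to/piwik is a placeholder; pick a limit suited to your server.
php -d memory_limit=1024M /path/to/piwik/misc/cron/archive.php
```

Whether any limit is enough for the two large sites is an open question, which is why batching per site would still be the better fix.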
Here are a few suggestions:
[li]Allow us to enable or disable sites[/li]
[li]Allow us to target a specific site id in the archive script[/li]
[li]Make the archive script work in batches, the way the MySQL script BigDump does[/li]
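The per-site batching idea could be sketched as a small wrapper that archives each site id separately, so a memory failure on one site does not abort the rest of the run. Here `run_archive` is a stand-in for the real Piwik archive invocation, and the site ids are examples:

```shell
#!/bin/sh
# Sketch of per-site batching: one archive run per site id, continuing
# past failures. run_archive is a placeholder for the actual Piwik
# archive command (this stub just prints what it would do).
run_archive() {
    echo "archiving site $1"
}

for id in 1 2 3; do
    if ! run_archive "$id"; then
        # A failed site is logged and skipped instead of killing the run.
        echo "site $id failed, skipping" >&2
    fi
done
```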
I’ve separated the archiving process into day, week, month and year scripts (so you can schedule these to run at different times).
The script already works in batches (though I’m not sure what you mean exactly): first site 1, then site 2, then site 3, and so on.
By batches I meant within each site. At the moment any site with too much traffic causes the script to fail; if it handled the archiving in chunks it might not have this problem.
You should try the files I linked to.
It will process every site by day, week, month and year separately (so practically in chunks). This could probably be done better (by doing it in a batch within one file), but it’s a start.
Do let me know whether it helped.
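Splitting by period also lets you stagger the heavy runs in cron. A possible crontab along those lines is sketched below; the script names and paths are placeholders for the split files, and the times are arbitrary examples:

```shell
# Hypothetical crontab: run the cheap daily archive hourly, and push
# the heavier week/month/year runs to off-peak times. Paths and
# filenames are placeholders for the split archive scripts.
0  * * * *  php /path/to/piwik/archive_day.php
15 0 * * 1  php /path/to/piwik/archive_week.php
30 0 1 * *  php /path/to/piwik/archive_month.php
45 0 1 1 *  php /path/to/piwik/archive_year.php
```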
I tried something similar myself but had the same problem.
For example, if I were to use your file for the monthly archive, two of my sites would make the script fail, and the same goes for the yearly one.
The day and week archives work OK at the moment; I’ve basically edited out the month and year steps for now so that the cron doesn’t fail.
Do I understand correctly that even when daily reports are available, Piwik will generate a month by building the report from the raw data again? (So instead of adding the 30/31 daily reports together to make the monthly report, it goes back to the raw data?)