At the moment, two of the thirty sites we are tracking with Piwik are hitting the allowed memory size limit whenever updates run, whether via the browser, via cron or via a shell command.
Any site that comes after one of these two in the run no longer gets updated, because the script bombs out as soon as it hits the allowed memory size error.
Apart from moving the whole Piwik install to a different server, we are now stuck.
Here are a few suggestions:
[ol]
[li] Allow us to enable or disable sites
[/li][li] Allow us to target a specific id in the archive script
[/li][li] Make the archive script work in batches in the way the mysql script BigDump works
[/li][/ol]
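To illustrate suggestions 2 and 3, here is a rough sketch of what per-site archiving could look like, so that one oversized site cannot abort the whole run. `archive_site` is a hypothetical stand-in for the real archive invocation, and the site IDs are placeholders; this is not an existing Piwik feature.

```shell
# Hypothetical stand-in for archiving a single site; the real call
# would invoke Piwik's archive script for one site id.
archive_site() {
    echo "archiving site $1"
}

# Archive each tracked site separately and keep going on failure,
# instead of letting one memory-hungry site kill the whole batch.
for id in 1 2 3; do
    archive_site "$id" || echo "site $id failed, continuing" >&2
done
```

The key point is the `||` fallback: a failure on one site is logged and the loop moves on to the next, rather than aborting every site that follows.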
By batches I meant within each site: at the moment any site with too much traffic causes the script to fail. If it processed the archive in chunks, it might avoid this problem.
It will process every site by day, week, month and year separately (so practically in chunks). This could probably be done better (by doing it in a batch within one file), but it’s a start.
Do I understand correctly that even when the daily reports are already available, Piwik will still generate a monthly report from the raw data again? (So instead of adding the 30/31 daily reports together to produce the monthly report, it goes back to the raw data?)