Memory limit hit on console archiving

Hello,

I’m running an instance with more than 6k sites and around 40GB of data in total.
Since upgrading from 2.16.x to 3.5.1 I’m getting memory limit errors during the archiving run:
PHP Fatal error: Allowed memory size of 3196059648 bytes exhausted (tried to allocate 4194304 bytes) in /…/stats/et/core/Db.php on line 595

I assume it happens towards the end of the run, maybe while deleting old data, because that hasn’t run since the upgrade (according to the anonymize data page).

As it’s already using 3GB - what can I do?

Thanks,

Thomas

You could try not archiving all sites at once? That might help with this.
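
For example (assuming your version’s core:archive command supports the --force-idsites option, and the URL and site IDs below are just placeholders), you could split the run into batches, roughly like this:

    # archive only a subset of site IDs per run (IDs and URL are examples)
    php ./console core:archive --url=https://stats.example.org --force-idsites=1,2,3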

Well, manually selecting the sites to archive in each run is pretty much out of the question with that many sites.
And since it was working in the past … there should still be a way to do it :confused:

Hi,

You need to modify the PHP limits of your server (memory limit or time limit).
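
For example (the 4G value is only illustrative, and Matomo may apply its own limit on top of php.ini), you could raise the memory limit just for the CLI run:

    # raise the memory limit only for this archiving run (value is an example)
    php -d memory_limit=4G ./console core:archive --url=https://stats.example.org

Alternatively, raise memory_limit in the php.ini that the CLI actually uses.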

There should be no time limit by default in CLI mode, and the script is already consuming about 3GB (3196059648 bytes) of RAM, which is not normal…

Which commands exactly lead to the error? Try running the archive command with higher verbosity and post the (anonymized) commands and output here.
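
For example (the URL is a placeholder; -vvv is the console’s usual maximum-verbosity flag), something like:

    # run archiving with maximum verbosity and keep the output for posting
    php ./console core:archive --url=https://stats.example.org -vvv > archive.log 2>&1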