Serialization error while archiving

My archive cronjob always breaks for one specific tracked URL. In DataTable.php I found that the function unserializeRows (line 1465) always evaluates $rows to false, i.e. it throws “The unserialization has failed!” even though the input looks like a regular array.
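One thing that may be worth checking (not from the thread, just a suggestion): this error is often caused by a corrupt or empty blob in one of the archive tables. A rough sketch of a query to spot suspicious rows, assuming the default `matomo_` table prefix, a blob table named `matomo_archive_blob_2024_01` for the affected period, and idSite 58 from my test command; adjust names and credentials to your install:

```shell
# Look for archive blobs for the failing site that are NULL or empty,
# which would make unserializeRows() fail after decompression.
# Table name, prefix, and credentials are assumptions -- adapt them.
mysql -u matomo -p matomo_db -e "
  SELECT idarchive, name, LENGTH(value) AS len
  FROM matomo_archive_blob_2024_01
  WHERE idsite = 58
    AND (value IS NULL OR LENGTH(value) = 0)
  LIMIT 20;"
```

If such rows exist, the stored archive itself is damaged and re-archiving after invalidation (rather than tuning memory or concurrency) is the more likely fix.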

Testing archivephp with other tracked URLs returns regular JSON:
{"idarchives":[296176],"nb_visits":1166}

I used the following console command for testing archivephp:
/usr/bin/php /var/www/piwik/console climulti:request --matomo-domain="https://xxx" --superuser "module=API&method=CoreAdminHome.archiveReports&idSite=58&period=year&date=2024-01-01&format=json&trigger=archivephp"

I already tried increasing memory_limit and max_execution_time. I am working with Matomo 5.1.2 and PHP 8.3; the charset for all tables is utf8mb4 and the collation is utf8mb4_general_ci.
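In case it helps others reading along: since only one site and period are affected, a common next step (my suggestion, not confirmed in this thread) is to invalidate the stored archive for that site and period so it gets rebuilt from the log data on the next archiving run. Matomo ships a console command for this; paths and the site id 58 are taken from my earlier command:

```shell
# Invalidate the (possibly corrupt) 2024 yearly archive for site 58
# so the next core:archive run rebuilds it from scratch.
/usr/bin/php /var/www/piwik/console core:invalidate-report-data --sites=58 --dates=2024-01-01 --periods=year
```

After invalidation, run core:archive again and watch whether the same unserialization error returns.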

Does anyone have an idea where else to look for the error?

Thank you

Hi @mucctecc
In my case, to make the archiving process more resilient, I limited the number of sites archived per run with the --max-websites-to-process parameter.

Dear Philippe,

thank you for your hint. I already tried --max-websites-to-process with different values, and I also tried the concurrency options with different values, e.g.

/usr/bin/php -d memory_limit=20G -d max_execution_time=0 /var/www/piwik/console core:archive --concurrent-requests-per-website=1 --concurrent-archivers=1 --url=https://xxx

That doesn’t work either; it is still this unserialization error in core/DataTable.php. I checked the charset and collation of the tables, and they are as defined in config.ini.php.
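For reference, what I am trying next (a sketch, not yet verified): running the archiver for only the failing site and period with maximum verbosity, which should show the exact report or plugin whose data fails to unserialize. --force-idsites and --force-periods are standard core:archive options; -vvv is the usual console verbosity flag:

```shell
# Archive only site 58, yearly period, with verbose output to
# pinpoint which report triggers the unserialization failure.
/usr/bin/php /var/www/piwik/console core:archive --force-idsites=58 --force-periods=year --url=https://xxx -vvv
```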

Hi @mucctecc
For which kind of archive do you encounter the issue: daily, weekly, monthly, or yearly? A specific custom period? (Check the cron log.)
What is the daily tracking volume? How many segments? How many installed Matomo plugins?
Do you see any errors in the Matomo error log file?