More than 13 hours to process archives (1400+ segments)


Is it normal behavior that it takes more than 12 hours to process 1400 segments across about 100 websites?

Segments contain at most 2 conditions (a custom dimension and a page URL), no more.

I have a VM with 4 cores & 16 GB RAM, and a dedicated database server with 2 cores and 1 GB RAM (max 30 connections).

Sometimes it takes more than 160 seconds to process a segment that has 0 visits…

The log table is about 1 GB.

This is the command in the crontab, launched daily:

www-data /usr/bin/php /var/www/matomo/console core:archive --force-all-websites --concurrent-requests-per-website=10 --url=analytics.url
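As an aside, it helps to keep the archiver's console output so per-run timings can be reviewed later. A sketch of the same crontab entry with output redirected to a log file (the log path is an assumption; use any location writable by www-data):

```shell
# Same command, but append the console output (and errors) to a log file.
# /var/log/matomo-archive.log is a hypothetical path; adjust to your setup.
www-data /usr/bin/php /var/www/matomo/console core:archive --force-all-websites \
  --concurrent-requests-per-website=10 --url=analytics.url \
  >> /var/log/matomo-archive.log 2>&1
```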

Thank you,

Hi @MrJibus
Can you check in the log of the archive process how long the longest and the average archiving runs take?
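One way to answer that, assuming the cron output was redirected to a file and contains Matomo's "Time elapsed: 42.123s" timing lines (verify the exact wording against your own log; the log path here is a placeholder):

```shell
# Summarize archiving durations from a saved core:archive log.
# Both the path and the "Time elapsed:" line format are assumptions to verify.
LOG=/var/log/matomo-archive.log

if [ -f "$LOG" ]; then
  echo "Slowest runs:"
  grep -o 'Time elapsed: [0-9.]*s' "$LOG" | sort -k3 -rn | head -10

  echo "Average duration (seconds):"
  grep -o 'Time elapsed: [0-9.]*' "$LOG" \
    | awk '{sum += $3; n++} END { if (n) printf "%.2f\n", sum / n }'
fi
```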
Also, for which periods did you enable the unique visitor calculation (day, week, month, year, range)?
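For reference, this is controlled per period in Matomo's configuration, and enabling unique visitors for year and range makes archiving considerably more expensive. A sketch of the relevant settings (the values shown are illustrative, not your instance's actual configuration):

```ini
; config/config.ini.php — unique visitor processing per period.
; Enabling year/range forces much heavier aggregation during archiving.
[General]
enable_processing_unique_visitors_day = 1
enable_processing_unique_visitors_week = 1
enable_processing_unique_visitors_month = 1
enable_processing_unique_visitors_year = 0
enable_processing_unique_visitors_range = 0
```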

How heavily is your database loaded? 13 hours is very long.

You can check it easily with the DBStats plugin (module=DBStats in the URL).
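If you prefer the command line over the DBStats page, the same table-size information can be read from MySQL directly. A sketch, assuming the database is named "matomo" (adjust the schema name and credentials to your setup):

```shell
# List the ten largest tables in the Matomo database, data + indexes, in MB.
# Guarded so the snippet is a no-op on machines without a MySQL client.
if command -v mysql >/dev/null; then
  mysql -e "
    SELECT table_name,
           ROUND((data_length + index_length) / 1024 / 1024) AS size_mb
    FROM information_schema.tables
    WHERE table_schema = 'matomo'
    ORDER BY size_mb DESC
    LIMIT 10;"
fi
```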