What are the recommended steps to keep Matomo running smoothly for high-traffic websites?

Hi everyone,
I’ve recently deployed Matomo on my company’s server to track analytics for a growing website. Traffic has been increasing steadily, and I want to make sure the setup remains stable as we scale.

A few areas I’d like advice on:

  • Database optimization tips for handling larger datasets

  • Best practices for configuring archiving and reports

  • How to balance performance and data retention policies

  • Any recommended plugins or configurations that improve reliability

  • Security measures: would regular penetration testing be useful to keep Matomo protected while it collects sensitive user data?

Would appreciate insights from anyone running Matomo on busy websites: what worked well for you, and what pitfalls should I avoid?

Thanks in advance!

The multi-server approach works well for us. We run the archiving jobs and the database on separate servers. For the DB, we use a primary/replica setup, and we queue incoming tracking requests with Redis.
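For the Redis queue piece, Matomo's QueuedTracking plugin provides a console command to drain the queue. A minimal sketch of running it as a cron worker, assuming the plugin is installed and configured for Redis; the paths, user, and schedule below are placeholders for your installation, not part of the original post:

```
# /etc/cron.d/matomo-queue (sketch): process queued tracking requests
# every minute. Adjust the PHP binary, Matomo path, and user to your setup.
* * * * * www-data /usr/bin/php /var/www/matomo/console queuedtracking:process >/dev/null 2>&1
```

Running the worker from cron keeps request processing off the tracking hot path, which is what makes the queue effective during traffic spikes.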

To keep Matomo running smoothly on high-traffic websites, it’s important to optimize both the infrastructure and the analytics configuration. One common recommendation is a multi-server setup, where the database, archiving jobs, and application servers run separately to improve scalability and reliability. Many setups also queue tracking requests in Redis or MySQL, which helps absorb traffic spikes.

In addition, enabling PHP OPcache, tuning MySQL, and scheduling CLI archiving via cron jobs can significantly reduce server load and speed up report generation. For larger deployments, load balancing, caching layers, and optimized database settings help Matomo remain stable even with millions of page views.
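As a concrete sketch of the cron-based CLI archiving mentioned above, using Matomo's `core:archive` console command; the PHP binary, install path, URL, and log location are assumptions for illustration:

```
# /etc/cron.d/matomo-archive (sketch): pre-compute reports hourly so that
# report views never trigger archiving in the browser.
5 * * * * www-data /usr/bin/php /var/www/matomo/console core:archive \
    --url=https://matomo.example.com/ > /var/log/matomo-archive.log 2>&1
```

Once cron archiving runs reliably, browser-triggered archiving can be turned off in `config.ini.php` by setting `enable_browser_archiving_on_demand = 0` under `[General]`, so heavy report requests no longer compete with tracking traffic.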