We (a non-profit) are tracking roughly 75–100 million visits a year on a dedicated server (8 CPU, 32 GB RAM), with a second server for testing. I have been using Tag Manager for a while now. Everything was working like a charm until yesterday, when I needed to analyze on-page user interaction: anchor menu clicks and scroll reach. This could only be done by triggering 1 to 4 new tags on every page, on top of what we already had: two data layers feeding custom dimensions, plus the occasional event tracker. All in all it meant a LOT more entries in the visitor logs. The event data feeds maybe 6 custom reports (we archive via a cron job, of course). See the sketch below for roughly what the new tags do.
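For context, here is a minimal TypeScript sketch of roughly what each page now fires, assuming the tags ultimately push standard Matomo events through the global `_paq` tracker queue (the selectors, category names, and scroll thresholds are illustrative, not our exact configuration). The point is that every anchor click and every scroll threshold crossed becomes one extra tracking request, so at our volume even 1–4 extra events per pageview can mean hundreds of thousands of additional log writes per day:

```typescript
// Illustrative sketch only — real setup is done via Tag Manager tags/triggers.
declare const _paq: unknown[][]; // Matomo's standard tracker queue

// Anchor menu clicks: one extra tracking request per click.
document.querySelectorAll('a[href^="#"]').forEach((anchor) => {
  anchor.addEventListener('click', () => {
    _paq.push(['trackEvent', 'Anchor Menu', 'Click', (anchor as HTMLAnchorElement).hash]);
  });
});

// Scroll reach: one extra tracking request per threshold crossed, per page.
const thresholds = [25, 50, 75, 100]; // hypothetical percentages
const fired = new Set<number>();
window.addEventListener('scroll', () => {
  const reached = Math.round(
    ((window.scrollY + window.innerHeight) / document.documentElement.scrollHeight) * 100
  );
  for (const t of thresholds) {
    if (reached >= t && !fired.has(t)) {
      fired.add(t); // fire each threshold only once per pageview
      _paq.push(['trackEvent', 'Scroll Reach', 'Percent', String(t)]);
    }
  }
});
```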
This was clearly too much for our poor server. The main website wasn't affected and was still being tracked (as far as I can tell, anyway; I'm not entirely sure all visits were recorded), but a smaller satellite website stopped being tracked entirely because the tracking script wasn't loading within its set time limit. On top of that, the Matomo UI became pretty much inaccessible/unresponsive.
I am scratching my head here: was this to be expected? Was I being naive? I did expect it to put a strain on the server, but it went from smooth sailing to death's door very fast! What would be best practice in this case? Should I never have done it this way? We do have a bunch of premium plugins, like heatmaps, but I couldn't really use that one here because I needed an overall indication, and heat, scroll and click maps are applied to specific pages.
Or is there nothing wrong with this method, and is it just a matter of server resources?
Not a very clear question, I know, but I value your opinions, so whatever comes to mind…