Log analytics: Too many entries in logfiles

I have tried to import some logfiles from an IIS server running an Umbraco website into a local Matomo installation, but I ran into problems after 5-10 logfiles because it imports too many entries and then the API/frontend stops working.

This message is shown in the browser console:
Failed to load resource: the server responded with a status of 500 (Internal Server Error)
http://localhost:808/index.php?forceView=1&viewDataTable=sparklines&module=VisitsSummary&action=get&idSite=1&period=range&date=2020-04-01,2020-04-30&segment=&showtitle=1&random=5706

The problem is this SMALLINT field in the database: visit_total_interactions
When I import logfiles, after a few days' worth of logs the value gets too big and the API queries crash.

Solutions I have tried:
First I changed the import script so it could exclude certain IP addresses (similar to the path exclusion), and I wrote a script to analyse the logfiles for the most used IP addresses, so the number of imported entries was brought down. But after a few more logfiles, I still hit the ceiling of how big a number visit_total_interactions can hold.
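The IP analysis was basically just counting client IPs per logfile. A minimal sketch of that idea (not my exact script; it assumes the default IIS W3C format, where the #Fields: header contains a c-ip column):

# Count the most frequent client IPs in IIS W3C logfiles.
import sys
from collections import Counter

counts = Counter()
for path in sys.argv[1:]:
    ip_index = None
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            if line.startswith("#Fields:"):
                # The header names the columns, e.g. "date time c-ip cs-method ..."
                fields = line.split()[1:]
                ip_index = fields.index("c-ip") if "c-ip" in fields else None
            elif not line.startswith("#") and ip_index is not None:
                parts = line.split()
                if len(parts) > ip_index:
                    counts[parts[ip_index]] += 1

for ip, hits in counts.most_common(20):
    print(f"{hits:>10}  {ip}")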

Then I tried modifying the database field (from SMALLINT to BIGINT):
ALTER TABLE matomo_log_visit MODIFY visit_total_interactions BIGINT;
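Before and after the ALTER, a quick way to see how far the values actually go is to check the current maximum. Below is a minimal sketch assuming pymysql and placeholder credentials; the same query can of course be run directly in the MySQL client.

# Check how large visit_total_interactions has grown (placeholder credentials).
import pymysql

conn = pymysql.connect(host="localhost", user="matomo", password="secret", database="matomo")
try:
    with conn.cursor() as cur:
        # A SMALLINT column tops out at 32767 (65535 if unsigned),
        # so anything above that overflows the original column type.
        cur.execute("SELECT MAX(visit_total_interactions) FROM matomo_log_visit")
        (max_value,) = cur.fetchone()
        print("max visit_total_interactions:", max_value)
finally:
    conn.close()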

After the DB field change I could import a whole month of files, but the API/frontend still runs into the same problem, where the numbers involved in the queries get too large.

So now I'm basically out of ideas as to what to do from here. Do you have any suggestions for what else I could try?

Here is the local installation info:
Matomo version: 3.14.1
MySQL version: 8.0.21
PHP version: 7.4.1