It seems as if Python doesn’t play well with the firewall the server is behind, even though the URL I’m attempting to connect to is on the same server as the log files.
At least, that’s what I’m inferring from the error I’m getting:
Fatal error: [Errno 110] Connection timed out
Is there a trick or config option I can use to get around this?
Could you check your server error logs to see the exact error?
This is from when I’m running the import_logs.py script from the shell; there aren’t any entries in the error logs for it.
Here’s what I get running a connection test to a local domain and an external domain from the server:
# curl -I http://my.domain.com
curl: (7) couldn't connect to host
# curl -I http://www.yahoo.com
HTTP/1.1 200 OK
(plus a ton of other info)
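Here’s a quick Python version of the same test, in case anyone wants to reproduce it (the hostnames are placeholders for the ones in the curl test above; I’m assuming a plain TCP connect is enough to trigger the same [Errno 110] behaviour, since that’s the first thing urllib does anyway):

```python
import socket

def probe(host, port=80, timeout=5):
    """Plain TCP connect -- the same first step urllib (and therefore
    import_logs.py) takes before sending any HTTP request."""
    try:
        sock = socket.create_connection((host, port), timeout=timeout)
        sock.close()
        return True, None
    except socket.error as exc:  # timeout, refused, unreachable, DNS failure
        return False, str(exc)

# Placeholder hostnames -- substitute your own:
for host in ("my.domain.com", "www.yahoo.com"):
    ok, err = probe(host)
    print(host, "->", "connected" if ok else err)
```

If the local domain fails here too, it confirms the problem is network-level rather than anything specific to the import script.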
This is what I do see in the error logs though, over and over again every few hours:
mod_fcgid: read data timeout in 31 seconds, referer: /index.php?module=MultiSites&action=index&idSite=1&period=month&date=today
Premature end of script headers: index.php, referer: /index.php?module=MultiSites&action=index&idSite=1&period=month&date=today
File does not exist: themes/logo.png, referer: /index.php?module=CoreAdminHome&action=generalSettings&idSite=1&period=month
Running 1.9b2, FYI.
It looks like the “all websites dashboard” request is timing out. Make sure you set up the cron as explained in: How to Set up Auto-Archiving of Your Reports - Analytics Platform - Matomo
This should fix the error, since the reports will be pre-processed.
But that isn’t linked to import_logs, so I’m not sure about that part.
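For reference, a typical crontab entry looks something like this (the PHP binary, install path, and URL are examples, not your actual locations, and the cron script’s name has changed between Piwik versions — check the guide linked above for your release):

```shell
# Pre-archive all reports every hour so dashboard requests don't time out
5 * * * * /usr/bin/php /path/to/piwik/misc/cron/archive.php --url=http://my.domain.com/piwik/ > /dev/null
```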
The set of errors from the error_log seems to have gone away after I set the interval to 15 minutes. None of the sites currently being tracked generate enough hits to need the cron yet; I’m holding off on that until I know I can import the logs for the two big sites first.
Still not sure why the server cannot connect to the virtual domains that are hosted on it, so I’ve asked the guys who manage the DNS and firewall about it… we’ll see what they say.
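One workaround I’m considering while I wait: if the firewall is doing NAT and dropping “hairpin” connections (the server talking to its own public address), pointing the virtual host names at the local interface in /etc/hosts might sidestep it entirely. Just a sketch, assuming the sites really are served from this same box:

```shell
# /etc/hosts on the server itself -- assumption: my.domain.com is hosted
# on this box, so local requests can skip the firewall/NAT round trip
127.0.0.1   my.domain.com
```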