The import_logs.py process throws some errors and I can't fix them.
Could someone help me?
Environment:
DB: MySQL, 120 GB
Load: 4,000,000 requests per day
RAM: 4 GB
I launch the process with these parameters:
python /var/piwik/misc/log-analytics/import_logs.py --url=my_URL --recorders=2 --recorder-max-payload-size=1000 --replay-tracking *.log
ERROR 1:
2016/02/20 02:14:56 [error] 7851#0: *4784 FastCGI sent in stderr: "PHP message: Error in Piwik (tracker): Error query: SQLSTATE[HY000]: General error: 1205 Lock wait timeout exceeded; try restarting transaction In query: UPDATE piwik_log_visit SET idvisitor = ?, visit_last_action_time = ?, visit_exit_idaction_name = ?, visit_exit_idac_', …
ERROR 2:
2016/02/19 23:30:11 [error] 7851#0: *4431 FastCGI sent in stderr: "PHP message: Error in Piwik (tracker): Error query: SQLSTATE[40001]: Serialization failure: 1213 Deadlock found when trying to get lock; try restarting transaction In query: UPDATE piwik_log_visit SET idvisitor = ?, visit_last_action_time = ?, visit_exit_idaction_name = ?, visit_exit_idaction_url = ?, visit_total_actions = visit_total_actions + 1 , visit_total_time = ? WHERE idsite = ? AND idvisit = ? Parameters: array ( …
Your hardware isn't really strong enough to support 120M actions per month. I suggest slowing down the log importer: try importing with a smaller payload (200 is the default value; I suggest going with 100 requests per bulk request) and a single recorder. If everything goes well, you can tune it up a little from there.
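For example, a more conservative invocation (same script and flags as in your command above; paths and the URL placeholder are yours to adjust) could look like this:

```shell
# Start with one recorder and a smaller bulk payload;
# increase both gradually only if the database keeps up.
python /var/piwik/misc/log-analytics/import_logs.py \
    --url=my_URL \
    --recorders=1 \
    --recorder-max-payload-size=100 \
    --replay-tracking \
    *.log
```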
The database is not able to handle the load when too many requests arrive at the same time. Of course it could be optimized at the application level, but for now the only real option is to upgrade the hardware; otherwise you have to reduce the speed of the import process.
UPDATE piwik_log_visit SET idvisitor = … WHERE idsite = '…' AND idvisit = '…'
so I created a new index on that table on (idsite, idvisit), because that SQL statement was taking a lot of time…
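The statement was along these lines (the index name here is my own choice, not anything Piwik requires):

```sql
-- Composite index matching the WHERE clause of the slow UPDATE
ALTER TABLE piwik_log_visit ADD INDEX index_idsite_idvisit (idsite, idvisit);
```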
After I created that index and configured the import process with recorders=2 and payload=1000, the import started loading all the data without bringing the database down, but the error messages continue…