A way to effectively archive stats for 50,000+ websites

Hello,

My virtual tour company runs Piwik to track visitors to tours. We currently have more than 50,000 active tours, each treated as a separate website by Piwik. Running archive.php takes longer than a week on a dedicated web server with four Xeon processors. The database is stored on a separate, dedicated quad-core Xeon server.
This isn't practical. Is there any way to make it fit within a 24-hour window (ideally 2-3 hours)?
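One idea I'm considering is splitting the site IDs into ranges and running several archive workers in parallel, one per range. This is only a sketch: it assumes our archiver (archive.php, or the core:archive console command in newer Piwik versions) accepts a site-ID filter such as --force-idsites, which you should verify against your Piwik version before relying on it. The commented invocation, paths, and URL are placeholders for illustration.

```shell
#!/bin/sh
# Split ~53k site IDs into equal ranges, one per parallel worker.
TOTAL=52963   # total number of sites (from the archive log)
WORKERS=8     # how many archive processes to run in parallel

# Ceiling division so every site falls into exactly one range.
CHUNK=$(( (TOTAL + WORKERS - 1) / WORKERS ))

for i in $(seq 0 $((WORKERS - 1))); do
  START=$(( i * CHUNK + 1 ))
  END=$(( START + CHUNK - 1 ))
  [ "$END" -gt "$TOTAL" ] && END=$TOTAL
  echo "worker $i: sites $START-$END"
  # Hypothetical invocation -- adjust path, URL, and flag to your setup,
  # and confirm --force-idsites exists in your Piwik version:
  # php /path/to/piwik/misc/cron/archive.php --url=http://stats.example.com/ \
  #     --force-idsites="$(seq -s, "$START" "$END")" &
done
# wait   # uncomment to block until all background workers finish
```

With 8 workers each handling ~6,600 sites, a run that is currently serial would at best shrink by roughly the worker count, provided the database server can absorb the concurrent load.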

Thank you!

Update: based on the log below, the initial archiving run is going to take several months.
Why can a website with zero visits take 4-5 minutes to archive?

INFO [2014-03-27 09:01:25] [0afe0] Archived website id = 987, period = week, 0 visits, Time elapsed: 271.893s
INFO [2014-03-27 09:01:39] [0afe0] Archived website id = 987, period = month, 0 visits, Time elapsed: 14.389s
INFO [2014-03-27 09:01:41] [0afe0] Archived website id = 987, period = year, 0 visits, Time elapsed: 1.970s
INFO [2014-03-27 09:01:41] [0afe0] Archived website id = 987, today = 0 visits, 4 API requests, Time elapsed: 335.744s [7998/52963 done]
INFO [2014-03-27 09:02:26] [0afe0] Archived website id = 988, period = day, Time elapsed: 44.702s
INFO [2014-03-27 09:06:58] [0afe0] Archived website id = 988, period = week, 0 visits, Time elapsed: 271.786s
INFO [2014-03-27 09:07:12] [0afe0] Archived website id = 988, period = month, 0 visits, Time elapsed: 13.597s
INFO [2014-03-27 09:07:13] [0afe0] Archived website id = 988, period = year, 0 visits, Time elapsed: 1.977s
INFO [2014-03-27 09:07:13] [0afe0] Archived website id = 988, today = 0 visits, 4 API requests, Time elapsed: 332.064s [7999/52963 done]