Update with multiple zip files - max execute time

Hello,

Updating Matomo is a pain when you have shared hosting: I have a maximum of 30 seconds to run a PHP file. The updater runs with jQuery triggering one step after another, and it dies (max execution time) when it tries to download the new Matomo zip file or when it tries to unzip it.

If the updater were changed to download the zip in 4 parts (4 zip downloads in 4 separate PHP requests), it would not time out and everything would work fine. If the zip parts were then unzipped one after another, each in a new PHP request, the update would finish.

Matomo works great on my hosting; only the updater times out.

Now:
-> Checking files
-> download new zip from the Matomo server
-> unzipping
-> update database

I propose:
-> Checking files
-> download new zip from the Matomo server
----> file 1
----> file 2
----> file 3
----> file 4
-> unzipping
----> file 1
----> file 2
----> file 3
----> file 4
-> update database
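The chunked download above could be sketched roughly like this (a hypothetical illustration in Python, not Matomo's actual PHP updater — all names here are made up): each short-lived request fetches one byte range and persists its progress, so no single request exceeds the host's execution limit.

```python
# Hypothetical sketch of a resumable, chunked download. Each "request"
# fetches one byte range and records the new offset, so no single
# request runs longer than the host's 30-second limit.

def byte_ranges(total_size, chunk_size):
    """Split a download of total_size bytes into (start, end) ranges."""
    ranges = []
    start = 0
    while start < total_size:
        end = min(start + chunk_size, total_size)
        ranges.append((start, end))
        start = end
    return ranges

def download_one_chunk(fetch_range, state):
    """Runs inside a single short-lived request: fetch the next byte
    range, append it to the buffer, and persist the new offset."""
    start = state["offset"]
    end = min(start + state["chunk_size"], state["total_size"])
    if start >= end:
        return state  # download already complete
    state["data"] += fetch_range(start, end)
    state["offset"] = end
    return state

# Demo: a stand-in payload instead of matomo.zip, and a local slice
# instead of a real HTTP Range request.
payload = bytes(range(256)) * 40  # 10240 bytes
fetch = lambda s, e: payload[s:e]
state = {"offset": 0, "chunk_size": 4096,
         "total_size": len(payload), "data": b""}
requests = 0
while state["offset"] < state["total_size"]:
    state = download_one_chunk(fetch, state)
    requests += 1
print(requests, state["data"] == payload)
```

In a real implementation the fetch would be an HTTP request with a `Range: bytes=start-end` header, and the offset would be stored server-side between requests; the same resume-from-offset pattern would apply to extracting the zip file by file.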

Hi,

Disclaimer: What follows below is my opinion (so I might be completely wrong there and other people might also disagree here).

I think adding this feature is a bad idea.
I quickly tested it: unzipping matomo.zip on my PC takes 0.473 s. On my webserver (which is a cheap virtual server on Hetzner Cloud) it takes 0.927 s (and downloading takes less than 2 s).

So if your webhost's server cannot do the same without taking more than thirty times as long as my server (which is probably cheaper than what you are paying your webhost), then I honestly have to blame your webhost for it.

What you propose could be implemented and would probably make the update smoother in your case. But it would also add complexity to the build process (splitting the release into equally sized zip files) and to the updater (properly redirecting users from one step to the next). That means new bugs could be introduced and new issues could occur during the update for other people.

And most importantly, it would mean you would still have only the bare minimum of usability; someone would have to spend time implementing this, and the only ones really benefiting would be your host (as they would not need to upgrade their slow servers, since people seem happy with the minimal service they provide).

As you might have noticed, I am not a big fan of (some) shared hosters. I have had bad experiences in the past where people were persuaded to sign hosting contracts with an ancient PHP version, ridiculously little space (250 MB including log files, config files and everything else), and no option to cancel on a reasonable timescale, all for far more money than a real webserver with oneself as the only user would cost.

I get that there are very good reasons to use shared hosts, but I really recommend everyone to compare offers and not stay with someone who thinks renting out a completely overloaded server to hundreds of people, while denying them any say over the PHP config, is an acceptable business.

Sorry for the rant :slight_smile: