Upgrade with large database

I’m testing the upgrade from 2.4 -> 2.16, and so far it has been running for 20 hours and is at query 96 of 194. The database is fairly large (120GB) and tracks 13,000 websites.

The DB server is a fairly high-specced VM on Azure (4 cores / 14GB RAM / SSD disk), and MySQL has been tuned as best I can. The bottleneck is almost certainly the disk, as the CPUs are fairly idle and showing high I/O wait.

I can’t use Redis given the version, but I could maybe replay the logs. Are there any tips or tricks to make this as painless as possible? What do people think is the best way to proceed without losing tracking data or denying users access (via the API)?

Thanks in advance.