500 or so individual customer databases on cloud-hosted MySQL (MariaDB) servers, all with identical structure. One table in each database holds binary data (as BLOBs) - JPGs and PDFs for the most part - attached to journal entries recording user activity stretching back years.
Those binary-data tables are getting very large (>100 GB) and I'm eating the cost of ever-increasing disk capacity. Most of the PDFs in particular will probably never be viewed again, but that "probably" is the problem - they have to stay available, because sometimes a user needs to look back over the history of their dealings with their customers (landlords and tenants in this instance).
I'd like to archive old files to cheaper online storage (e.g. https://aws.amazon.com/glacier/). There are existing connectors for MySQL->AWS, but they operate at the table level and I'd prefer to work record by record.
So I'm planning on running housekeeping jobs that scan for older records, keep each record (with its filename) but move the file itself to AWS or similar, and mark the record as archived. If the user later tries to access that file, they see a placeholder and I automatically trigger a background retrieval of the file and its re-insertion into the table.
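To make the idea concrete, here's a minimal sketch of the selection/keying logic such a housekeeping job might use. Everything here is an assumption, not your schema: the column names (`created_at`, `archived`), the three-year cutoff, and the key layout are all hypothetical; the actual upload (e.g. boto3 `put_object` with a Glacier storage class) and the `UPDATE ... SET blob_column = NULL, archived = 1` step would wrap around these helpers.

```python
import hashlib
from datetime import datetime, timedelta, timezone

# Hypothetical policy: archive anything older than ~3 years.
ARCHIVE_AGE = timedelta(days=365 * 3)

def is_archivable(created_at: datetime, now: datetime) -> bool:
    """A record qualifies for archiving once it is older than the cutoff."""
    return now - created_at > ARCHIVE_AGE

def object_key(db_name: str, record_id: int, filename: str) -> str:
    """Deterministic object key per (database, record), so a record can
    always be mapped back to its archived file.  A short hash of the
    database/record pair guards against duplicate filenames."""
    digest = hashlib.sha256(f"{db_name}/{record_id}".encode()).hexdigest()[:12]
    return f"{db_name}/{record_id}-{digest}/{filename}"

# Sketch of the per-database loop (DB/S3 calls elided):
#   for row in SELECT id, filename, created_at FROM journal_files
#              WHERE archived = 0:
#       if is_archivable(row.created_at, now):
#           upload blob to object_key(db, row.id, row.filename)
#           UPDATE journal_files SET blob_column = NULL, archived = 1
#           WHERE id = row.id
```

One design note: deriving the key deterministically from the record (rather than storing a random key in a new column) means the "archived" flag alone is enough to reconstruct where the file went, which simplifies the later placeholder/retrieval path.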
Any better ideas or similar experiences?