I have a project with 28,000 tags saving values to the historian every second. In PostgreSQL there is one table per day, each 30-40 GB in size.
For days older than 6 months, we don't need to keep every one-second value; those tables can be summarized down to a 10-second rate, which will save disk space.
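To make the summarization concrete, here is a minimal sketch of the 10-second bucketing logic in plain Python. It assumes a simple per-bucket average; the actual aggregate should match whatever aggregation mode the historian query uses. The helper names (`bucket_10s`, `downsample`) and the sample data are hypothetical, just for illustration.

```python
def bucket_10s(ts):
    # Hypothetical helper: floor a timestamp (in seconds) to the
    # start of its 10-second bucket, e.g. 107 -> 100.
    return ts - (ts % 10)

def downsample(rows):
    # rows: iterable of (timestamp_seconds, value) pairs at 1-second rate.
    # Group values by 10-second bucket and keep one averaged row per
    # bucket (assumption: average; swap in min/max/last as needed).
    buckets = {}
    for ts, val in rows:
        buckets.setdefault(bucket_10s(ts), []).append(val)
    return sorted((b, sum(vals) / len(vals)) for b, vals in buckets.items())

# Hypothetical sample: 20 seconds of 1-second data for one tag.
rows = [(t, float(t)) for t in range(100, 120)]
print(downsample(rows))  # two rows, one per 10-second bucket
```

The same grouping can be expressed directly in SQL with an expression like `t_stamp - (t_stamp % 10000)` (for millisecond timestamps) in a `GROUP BY`, which would let the database do the summarization without pulling every row through a script.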
One way to summarize a table would be a script that creates a new historian table for each day older than 6 months and, for every tag in that table, calls system.tag.queryTagHistory with a 10-second aggregation, then inserts the result set into the new table. At the end I would drop the old table (with the 1-second values) and rename the new table to the old name, so the change stays transparent to export scripts and charts. But I suspect this script would be very expensive for the database: for each of the 28,000 tags it has to look up the tag's data type (a database query), call system.tag.queryTagHistory (a slow and RAM-hungry query), and insert the result into the new table.
What other approach could I use?