Database size using historian

How do you all handle databases and purging when logging lots of values?

Most of my data comes in over MQTT on value change, so I can be getting messages every few seconds. Some of that data is just displayed as a live value, but roughly half is logged using the history feature in the tag editor. I then use it to populate charts. Other data is also logged to the database using transaction groups; this is mostly QC data that we want to keep forever.

I’m consuming about 250 MB a day, so my hard drive space is being used up fast. I use a cloud-based service for hosting Ignition, so hard drive space isn’t cheap. We are looking to switch over to hosting it on our own server so we can throw in multiple-TB hard drives, but the problem will catch up with us eventually.
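Back-of-the-envelope, that rate works out to the following (the 250 MB/day figure is my rough observation, not a measured average):

```python
# Rough storage growth at ~250 MB/day of historian data.
MB_PER_DAY = 250

GB_PER_MONTH = MB_PER_DAY * 30 / 1024    # ~7.3 GB/month
GB_PER_YEAR = MB_PER_DAY * 365 / 1024    # ~89 GB/year

print(f"~{GB_PER_MONTH:.1f} GB/month, ~{GB_PER_YEAR:.1f} GB/year")
```

So even a 1 TB drive only buys a decade or so at the current rate, and less if we add more tags.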

When setting up a transaction group, you can set it to delete records after x. How is this handled with historical tags? Is there a feature to start deleting records after a certain age?

Am I doing something wrong that makes the history provider consume so much data?

I can’t speak to the last question, but you can configure pruning:


More on pruning:
https://docs.inductiveautomation.com/display/DOC81/Tag+History+Providers#TagHistoryProviders-DatasourceHistoryProviders
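Conceptually, pruning with a partitioned history provider just means dropping whole partition tables that have aged out of the retention window. A minimal sketch of that idea, assuming monthly partitions named on a `sqlt_data_<driver>_<year>_<month>` pattern (treat the naming as an assumption; in practice the gateway's built-in prune setting does this for you):

```python
import sqlite3
from datetime import datetime

RETENTION_MONTHS = 12  # assumed retention window for the sketch

def prune_partitions(conn, now=None):
    """Drop monthly partition tables older than the retention window."""
    now = now or datetime.now()
    cutoff = (now.year * 12 + now.month) - RETENTION_MONTHS
    cur = conn.execute(
        "SELECT name FROM sqlite_master "
        "WHERE type='table' AND name LIKE 'sqlt_data_%'"
    )
    dropped = []
    for (name,) in cur.fetchall():
        # Last two underscore-separated fields are year and month.
        *_, year, month = name.split("_")
        if int(year) * 12 + int(month) <= cutoff:
            conn.execute(f"DROP TABLE {name}")
            dropped.append(name)
    return dropped
```

Dropping a whole partition is much cheaper than row-by-row `DELETE`s, which is why age-based partitioning plus pruning is the usual approach for historians.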

You might also want to evaluate the deadband settings on your tags. If you don’t need every single point logged verbatim, you can widen the deadband so that values aren’t stored as frequently, leading to less data being stored:
https://docs.inductiveautomation.com/display/DOC81/Configuring+Tag+History#ConfiguringTagHistory-DeadbandandAnalogCompression
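To illustrate why this helps, here is a minimal sketch of an absolute deadband: a new sample is only stored when it moves at least `deadband` away from the last *stored* value. Ignition's analog compression is more sophisticated than this, but the effect on row count is the same idea:

```python
def apply_deadband(samples, deadband):
    """Keep only samples that differ from the last stored value by >= deadband."""
    stored = []
    for value in samples:
        if not stored or abs(value - stored[-1]) >= deadband:
            stored.append(value)
    return stored

# Noisy readings hovering around a setpoint (illustrative values).
raw = [10.0, 10.02, 10.05, 10.4, 10.41, 11.0, 10.98, 12.5]
print(apply_deadband(raw, 0.5))  # keeps 3 of the 8 samples
```

With MQTT report-by-exception feeding the historian, a deadband that matches the real precision you care about can cut the stored row count dramatically without losing meaningful trend data.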
