Does anyone have a formula that will allow me to calculate the amount of disk space required for historical archives?
Determining the size of your database really does vary depending on your setup; there are a lot of variables to take into account. Our historian logs data by exception and has the ability to compress the data, so if a value hasn’t changed in 10 days, nothing gets logged. There are three variables to look at:
Number of tags
Rate of change - % of tags that change every scan cycle
Compression - how the historian compresses the data
We try our best to keep the size of the database to a minimum, so when we estimate the size we like to think in terms of the worst-case scenario, meaning logging every scan cycle with no compression. Let’s say you have 1000 tags logging data every 10 seconds. That means every minute you will have 6 records for every tag, and each record for the historian is around 60 bytes.
Let’s do some math:
1000 tags * (6 records/min * 60 min * 24 hr) * 60 bytes = 518,400,000 bytes ≈ 518.4 MB / day
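For reference, here’s the same worst-case arithmetic as a small Python sketch (the variable names are illustrative, and the 60-byte record size is just the approximation used above):

```python
# Worst case: every tag logs a record on every scan cycle, no compression.
tags = 1000           # number of tags
scan_seconds = 10     # scan cycle in seconds
record_bytes = 60     # approximate size of one historian record

records_per_tag_per_day = (60 // scan_seconds) * 60 * 24   # 6/min * 60 min * 24 hr = 8,640
bytes_per_day = tags * records_per_tag_per_day * record_bytes

print(bytes_per_day / 1_000_000)   # 518.4 MB/day
```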
You can see the size is linear in the number of tags. On average we see around 54% compression, meaning only 46% of the raw data is actually stored, so if we take that into account:
518.4 * 46% = 238.46 MB / day
Now take rate of change into consideration: if only 10% of your tags are changing every scan cycle, you can expect:
238.46 * 10% = 23.85 MB / day
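Putting all three variables together, here’s a rough estimator you can adapt (the 54% compression and 10% rate of change are just the averages quoted above; your numbers will vary):

```python
def estimate_mb_per_day(tags, scan_seconds, record_bytes=60,
                        compression=0.54, rate_of_change=1.0):
    """Rough daily historian growth in MB.

    compression    -- fraction of raw size removed by compression (0.54 = 54%)
    rate_of_change -- fraction of tags that actually change each scan cycle
    """
    records_per_tag = 86_400 / scan_seconds        # records per tag per day
    raw_bytes = tags * records_per_tag * record_bytes
    return raw_bytes * (1 - compression) * rate_of_change / 1_000_000

# The numbers worked through above:
print(estimate_mb_per_day(1000, 10, compression=0.0))        # 518.4 (worst case)
print(estimate_mb_per_day(1000, 10))                         # ~238.5 (with compression)
print(estimate_mb_per_day(1000, 10, rate_of_change=0.1))     # ~23.8 (10% rate of change)
```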
Hopefully this gives you a good idea.
Using Travis’ approach on an existing project, you should be able to see how much actual data you are logging in a given time period and multiply it out. For a new or expanding project, it’s better to estimate the way he did.
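For example, a quick back-of-the-envelope extrapolation (the sample figures here are made up, just to show the arithmetic):

```python
# Hypothetical measurement: historian storage grew by sample_mb over sample_days.
sample_mb = 170.0     # observed growth in MB (example figure, not from this thread)
sample_days = 7

mb_per_day = sample_mb / sample_days
print(f"~{mb_per_day:.1f} MB/day, ~{mb_per_day * 365 / 1000:.1f} GB/year")
```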
@Travis.Cox a Google search led me to this post. I’m helping a customer size a new system.
Does this rule of thumb still hold?
Yes, this rule of thumb still holds true. Here is a spreadsheet you can use where you just fill in the numbers: