Hello all -
We have an IA Historian system that is less than one year old. We are still configuring more history tags — currently 6,000+, and we will probably end up with 10,000 or so. Our SQL DB size is about 35 GB. There is only ONE partition, on purpose: it keeps the query interface for reporting simpler, and we don't intend to ever archive the old data, at least not for 6 years.

IT just had to expand our SQL VM to 80 GB of storage because we had failed logs due to a full disk. IT is also having problems backing up the transaction log: the backups take 6+ hours to complete, then the next scheduled backup wants to start, and it fails. Backing up from SQL itself takes only about 15 minutes.

Does anyone know if Commvault SQL agents are supported for this app? I'm just looking for the best way to back up historical data. Since this is a tag history database, once the tag information is laid down to disk it isn't changed; it just sits there for queries. It doesn't seem like this should be an issue, but it is. Any suggestions? Thank you.
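One thing worth comparing against your Commvault job is a plain native log backup on a short schedule, so each backup stays small. A minimal sketch — the database name `Runtime` and the backup path are examples only, not taken from your system:

```sql
-- Hedged sketch: a native transaction log backup with compression
-- and checksums. Run on a short schedule (e.g. every 15-30 min)
-- so each log backup stays small. Database name and path are
-- placeholders; substitute your own.
BACKUP LOG [Runtime]
    TO DISK = N'E:\SQLBackup\Runtime_log.trn'
    WITH COMPRESSION, CHECKSUM;
```

If this native backup completes in minutes while the Commvault-driven one takes hours, that points at the agent/target side rather than at SQL Server or the log itself.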
How are you configuring the tag history? I am wondering if you’re collecting a bunch of extra data that you don’t actually need.
Paul, I am new to this forum, but I have used Commvault SQL agents on SQL databases, so yes, they are supported for SQL. You might increase the frequency of your log backups to cut the time per backup. My log backups run hourly, taking about 10 minutes to back up a 40 GB log file that is about 80% full. If your log backup is slow, and your log is going to the same destination as your database backup, then my suspicion is that your log configuration is to blame, especially if the log is using a default growth setting. Every time the transaction log grows, it adds VLFs (virtual log files) to the log, and too many of them can slow down your backups. Run DBCC LOGINFO on that database and you can see the number of VLFs. www.sqlskills.com has a great write-up about VLFs and performance. Good news: it is fixable.
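To make that concrete, here is a sketch of the check and the usual fix. The file and database names (`Runtime`, `Runtime_log`) and the sizes are examples only — size the log for your own workload:

```sql
-- Check the VLF count: DBCC LOGINFO returns one row per VLF in the
-- current database. Counts in the thousands tend to slow log backups.
DBCC LOGINFO;

-- If the count is excessive, the usual fix is to shrink the log and
-- then regrow it in one or a few large, fixed increments, instead of
-- letting it grow in many small default-sized autogrowths.
-- Names and sizes below are placeholders.
DBCC SHRINKFILE (N'Runtime_log', 0);
ALTER DATABASE [Runtime]
    MODIFY FILE (NAME = N'Runtime_log', SIZE = 8GB, FILEGROWTH = 4GB);
```

A fixed, generous FILEGROWTH means future growth events add only a handful of large VLFs rather than hundreds of tiny ones.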
Hello Paullys50 -
We have created standard historical configurations for tag classes, based primarily on engUnit, and sometimes on engUnit plus additional criteria. We set the historical deadband mode, the historical deadband, and the historical scan class to default values. In general our historical scan class is 10,000 ms. We set the historical deadband to a value that makes sense based on administrative requirements first, then instrument accuracy or required process capability second.

Are we collecting data that we don't need? Good question — it is not easy to answer. Everything I have experienced over many years, as well as read about process historians, is that it is difficult to predict how the data will be used; but one thing is for sure: if you don't collect it, you can't analyze it.
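One way to ground Paullys50's question in data is to see which tags are actually generating the volume. A hedged sketch — the table and column names (`TagHistory`, `TagName`, `SampleTime`) are hypothetical and will differ in your historian's schema:

```sql
-- Hedged sketch: rank tags by sample count over the last week to
-- spot tags whose deadband may be too tight (i.e. tags logging far
-- more samples than their peers). Table/column names are
-- placeholders; substitute your historian's actual history table.
SELECT TOP (50)
    TagName,
    COUNT(*) AS SampleCount
FROM dbo.TagHistory
WHERE SampleTime >= DATEADD(DAY, -7, GETDATE())
GROUP BY TagName
ORDER BY SampleCount DESC;
```

A handful of chattering tags with an overly tight deadband can dominate both database growth and log volume, so this is often the cheapest place to win back space.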