Related to my other thread: I'm at the point where I'm ready to start importing the data exported from our Rockwell Historian (OSI PI). I have all ~1850 tags imported, but when I look at the sqlth_te table in the database I only see 1711 tags, and I've realized that a tag doesn't actually get inserted into that table until it has valid data. Unfortunately, several data sources are offline and I can't say when they'll be back online. If a tag doesn't exist in the historical database, I can't import its data, since the import needs to reference the tagid. So I'm a bit stuck at the moment. Do I need to import them all as memory tags so that they get initialized in the database, and then re-import them as OPC tags so they have actual data sources? Hopefully I explained that well enough.
You shouldn't be importing data by writing directly to the historian tables; use system.tag.storeTagHistory, or the corresponding system.historian.storeDataPoints function if you're on 8.3. Then you just specify a tag path, because the "tagid" is an implementation detail of the SQL-based historian, not a property of the tag.
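For what it's worth, a minimal sketch of that approach, assuming the PI export has been flattened into `(tagPath, value, epoch-ms timestamp)` tuples (that row layout, the provider names, and the batch size are all assumptions, not anything from this thread):

```python
# Group exported (path, value, ts_ms) rows into the parallel lists that
# system.tag.storeTagHistory expects, in manageable batches.
def to_store_batches(rows, batch_size=500):
    """Return a list of (paths, values, ts_ms) batches."""
    batches = []
    for i in range(0, len(rows), batch_size):
        chunk = rows[i:i + batch_size]
        paths = [r[0] for r in chunk]
        values = [r[1] for r in chunk]
        stamps = [r[2] for r in chunk]
        batches.append((paths, values, stamps))
    return batches

# On the gateway (Jython), you would then do something like:
# from java.util import Date
# for paths, values, stamps in to_store_batches(rows):
#     system.tag.storeTagHistory(
#         "MyHistorianProvider", "default", paths, values,
#         timestamps=[Date(ts) for ts in stamps])
```

The upside is the historian creates the sqlth_te rows and picks the right partition for you; the downside, as noted below, is throughput on very large imports.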
8.1
This is 15 years of data, so that method won’t work. Direct import is the only option.
Maybe not the prettiest solution… but can you just insert a single data point into every tag? Then you can delete it after importing
Probably could’ve, but that would be a lot of work.
I ended up just modifying my tag import file to change them all from OPC to Memory. They immediately all had entries in the database. Then I changed them back to OPC and imported them again, overwriting the existing ones. That should do the trick.
Not with a script; it'd be a few lines.
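Something along these lines, as a sketch: pair every tag path with a throwaway value at one shared timestamp, store it so each tag gets its sqlth_te row, then delete by that timestamp afterwards. The provider names and the seed timestamp are assumptions.

```python
# Build a one-point-per-tag seed batch: same placeholder value and the same
# (easily deletable) timestamp for every path.
def seed_batch(paths, seed_ts_ms):
    values = [0] * len(paths)          # placeholder value, removed later
    stamps = [seed_ts_ms] * len(paths)
    return values, stamps

# On the gateway (Jython):
# from java.util import Date
# values, stamps = seed_batch(paths, 1000000000000)
# system.tag.storeTagHistory("MyHistorianProvider", "default", paths, values,
#                            timestamps=[Date(t) for t in stamps])
# Then, once the real import is done, delete the seed rows by timestamp
# from the relevant sqlth_N_data partition table.
```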
Thanks, I’ll look into it. I’ve reached out to IA to see if there’s a more efficient way of doing a bulk import, since I’ve already written scripts that export the OSI PI data, use a lookup table to translate each tag path to its tagid, and write the result out in a structure that matches the PostgreSQL tables. The other thing that may be causing trouble is that we are using TimescaleDB with PostgreSQL, so my imports, while appearing successful, are not showing any data in the database tables, and no data shows up in the “Easy Chart” in a window.
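That lookup step can be sketched roughly as follows, assuming the export rows are `(tagPath, value, epoch-ms)` tuples (an assumption about the script, not something stated in the thread) and targeting the column layout of the stock sqlth_N_data tables (tagid, intvalue, floatvalue, stringvalue, datevalue, dataintegrity, t_stamp):

```python
GOOD_QUALITY = 192  # Ignition's "good" dataintegrity code

def to_sqlth_row(pi_row, tagid_by_path):
    """Map one exported PI row to a sqlth_N_data row tuple."""
    path, value, ts_ms = pi_row           # assumed export layout
    tagid = tagid_by_path[path]           # lookup built from sqlth_te (id, tagpath)
    intval = floatval = strval = None
    if isinstance(value, bool) or isinstance(value, int):
        intval = int(value)               # booleans and ints -> intvalue
    elif isinstance(value, float):
        floatval = value                  # floats -> floatvalue
    else:
        strval = str(value)               # everything else -> stringvalue
    return (tagid, intval, floatval, strval, None, GOOD_QUALITY, ts_ms)
```

Getting the value into the correct typed column matters; a float stored in intvalue (or vice versa) will query back wrong or not at all.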
I’d insert some history for different months for a single tag and see how it lands in the various tables. I don't know how it works with TimescaleDB, but normally there are partitions and data is inserted into the correct partition table, e.g. the table with the right yyyy-mm. Additionally, check that the dates in the tag table are correct and go back far enough for your old inserted data. They could be cutting it off.
We don’t use partitioning. I believe it was actually recommended NOT to partition when using Timescale, because Timescale breaks the data off into hypertables itself. The dates are correct; I already verified that. Like I said, when I look in the Easy Chart, the imported data is not showing. But it’s also not appearing in sqlth_3_data like I’d expect to see.
Bingo, I think I found the smoking gun. The sqlth_partitions entry for sqlth_3_data had a start_time that was more recent than my imported data. When I changed that to a time before the earliest imported record, the data suddenly appeared. So yes, bulk imports directly into the DB do, in fact, work. You just need to be very careful that all of the metadata lines up.
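For anyone following along, that fix amounts to widening the partition window in sqlth_partitions so history queries consider the old rows. A hedged sketch, assuming the stock schema (pname, start_time, end_time in epoch-ms) and made-up database/partition names; verify against your own tables first:

```python
from datetime import datetime

def epoch_ms(dt):
    """Naive UTC datetime -> the epoch-ms values the sqlth tables use."""
    return int((dt - datetime(1970, 1, 1)).total_seconds() * 1000)

# In the designer / gateway script console:
# new_start = epoch_ms(datetime(2008, 1, 1))  # earlier than the oldest import
# system.db.runPrepUpdate(
#     "UPDATE sqlth_partitions SET start_time = ? WHERE pname = 'sqlth_3_data'",
#     [new_start], database="HistorianDB")
```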
