Good question. We have a very efficient way of logging historical data: take snapshots of a “realtime” table periodically. We will create a separate table with the same column names, but different keys, to log the data to. Here are the steps:
- Create a similar historical table:
CREATE TABLE air_unit_history LIKE air_units
- Remove the auto-increment property from the air_units_ndx column, drop the primary key from air_units_ndx, and add it to t_stamp AND AU. This is really easy when editing the table in the MySQL Query Browser: uncheck the “AUTO INC” checkbox, then, to change the primary key, click the icon to the left of each column name (a blue diamond marks a regular column, a yellow key marks a primary key). We just created a new joint primary key. A query similar to this should have been generated for you:
ALTER TABLE `mydb`.`air_unit_history` MODIFY COLUMN `air_units_ndx` INTEGER NOT NULL DEFAULT 0, DROP PRIMARY KEY,
ADD PRIMARY KEY(`t_stamp`, `AU`)
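The effect of the joint primary key can be sketched with a small self-contained example. This uses Python's sqlite3 purely for illustration (not MySQL), and the temperature column and sample values are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# History table keyed on (t_stamp, AU): one row per unit per snapshot time.
# The "temperature" column is a made-up stand-in for the real data columns.
cur.execute("""
    CREATE TABLE air_unit_history (
        air_units_ndx INTEGER NOT NULL DEFAULT 0,
        t_stamp TEXT NOT NULL,
        AU TEXT NOT NULL,
        temperature REAL,
        PRIMARY KEY (t_stamp, AU)
    )
""")

cur.execute("INSERT INTO air_unit_history VALUES (1, '2024-01-01 00:00:00', 'AU1', 72.0)")

# A second row for the same unit at the same time stamp violates the joint key.
try:
    cur.execute("INSERT INTO air_unit_history VALUES (2, '2024-01-01 00:00:00', 'AU1', 73.0)")
except sqlite3.IntegrityError as exc:
    print("rejected duplicate:", exc)

# The same unit at a different time stamp is fine.
cur.execute("INSERT INTO air_unit_history VALUES (3, '2024-01-01 00:01:00', 'AU1', 72.5)")
print(cur.execute("SELECT COUNT(*) FROM air_unit_history").fetchone()[0])  # 2
```

In other words, the history table holds one row per air unit per snapshot time, which is exactly what the periodic logging needs.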
- Ensure that each “realtime” FactorySQL group has Store time/date stamp selected in the Action tab of the group properties. You can select every FactorySQL group at once and check that property in a single step.
- Create a query that reads the entire realtime table and “logs” it into the historical table. Run it a few times to make sure it works:
INSERT INTO air_unit_history SELECT * FROM air_units
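As a sanity check, here is a minimal sketch of that snapshot query in action, using Python's sqlite3 in place of MySQL; the temperature column and sample rows are made up for the demo:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# "Realtime" table: one current row per air unit (columns are illustrative).
cur.execute("""
    CREATE TABLE air_units (
        air_units_ndx INTEGER PRIMARY KEY,
        t_stamp TEXT NOT NULL,
        AU TEXT NOT NULL,
        temperature REAL
    )
""")
# History table: same columns, but keyed on (t_stamp, AU).
cur.execute("""
    CREATE TABLE air_unit_history (
        air_units_ndx INTEGER NOT NULL DEFAULT 0,
        t_stamp TEXT NOT NULL,
        AU TEXT NOT NULL,
        temperature REAL,
        PRIMARY KEY (t_stamp, AU)
    )
""")

cur.executemany(
    "INSERT INTO air_units VALUES (?, ?, ?, ?)",
    [(1, "2024-01-01 00:00:00", "AU1", 72.0),
     (2, "2024-01-01 00:00:00", "AU2", 68.5)],
)

# One snapshot: copy every realtime row into the history table.
cur.execute("INSERT INTO air_unit_history SELECT * FROM air_units")
conn.commit()
print(cur.execute("SELECT COUNT(*) FROM air_unit_history").fetchone()[0])  # 2
```

Because the column lists match, the bare `SELECT *` form works; if the tables ever drift apart, list the columns explicitly.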
- Now create a single FactorySQL group to run this query. Set the Update Rate to however frequently you’d like to log data. You’ll need to give the group a table name, even though it won’t really be used; I’d suggest dummy_table. Select update/select first record.
- In your new group, create an Action Item with Item Mode set to SQL Query and put your query under Command. Ensure that Execute this item on every update interval (add to trigger list) is selected (it will be by default).
That may sound like a lot, but it’s an efficient way to log what could be a large amount of data to a single table, with a single FactorySQL group and query.
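For the curious, the overall pattern the snapshot group implements can be sketched outside FactorySQL as a simple timer loop. This is only an illustration of the pattern, not the FactorySQL mechanism itself; it uses Python's sqlite3, the column names and timestamps are illustrative, and the loop is bounded for the demo:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE air_units (AU TEXT, t_stamp TEXT, value REAL)")
conn.execute("""CREATE TABLE air_unit_history (
    AU TEXT, t_stamp TEXT, value REAL, PRIMARY KEY (t_stamp, AU))""")
conn.execute("INSERT INTO air_units VALUES ('AU1', '', 72.0)")

# Bounded stand-in for the group's Update Rate: each cycle the realtime
# row gets a fresh time stamp (as FactorySQL's "Store time/date stamp"
# option would write), then the snapshot query copies it into history.
for cycle in range(3):
    conn.execute("UPDATE air_units SET t_stamp = ?",
                 (f"2024-01-01 00:00:{cycle:02d}",))
    conn.execute("INSERT INTO air_unit_history SELECT * FROM air_units")
    conn.commit()
    time.sleep(0.01)  # stands in for the group's update interval

print(conn.execute("SELECT COUNT(*) FROM air_unit_history").fetchone()[0])  # 3
```

Note that the fresh time stamp each cycle is what keeps the joint (t_stamp, AU) key from rejecting the snapshot, which is why the Store time/date stamp step above matters.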