File Tracking?

Our customer is asking us to save images from a vision system and keep them for 3 years.

Saving and storing the images is not a problem. What is a problem is ensuring every image is saved. Each image file name is serialized to the part manufactured, and our MES has a record of what was manufactured, so an image should exist in the file system for every part.

Is it possible to use Ignition to scan the file system for image file names and then alarm when any are missing?


You could certainly do this using Python, but why don’t you save them in the database as BLOBs? They’re then included in a database backup and won’t accidentally be deleted or moved.

I was thinking of that, but a lot of the chatter on Google points to the cons of doing so. In terms of size, I only plan to store about 250,000 1 MB images at any given time.


Out of curiosity, what were the cons mentioned?

A lot of the talk was about speed and storage requirements. It seemed like the popular thing to do was to store the path of the file in the DB.

We’ve gone through this process in designing a system for one of our customers. We started with storing the files in the database, but their requirements grew hugely and we considered moving the files into the file system. However, the big problem we found with doing that was getting the files to the server - this requires shares set up from every view node to the server, which just wasn’t practical. It was much easier to use Ignition’s built-in functions to read each file as an array of bytes and store them in the database as BLOBs.

In practice we have found that storing the files in the database works perfectly (we are currently using MySQL at that site). So far they have over 30GB of files and performance is great. Pictures don’t really compress further so I don’t see why people are complaining about storage requirements - they work out about the same both ways.

I would suggest keeping the attachments in either a separate table or even a separate database - this makes backing up easier. I would also keep the table stripped down to one column for the index and one for the BLOBs - trying to do anything with a table holding GBs of data takes forever, so keeping it to a minimum means you shouldn’t have to do much to it apart from back it up.
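A minimal sketch of that two-table layout. This uses sqlite3 purely for illustration (the site above runs MySQL, and the table and column names here are made up); the point is one lean metadata table and a separate BLOB table joined only by an index:

```python
import sqlite3

# In-memory DB as a stand-in for the dedicated image database.
conn = sqlite3.connect(":memory:")

# Metadata table: one row per part, keyed by the serialized file name.
conn.execute("""
    CREATE TABLE image_meta (
        id       INTEGER PRIMARY KEY,
        filename TEXT UNIQUE NOT NULL,
        saved_at TEXT NOT NULL
    )
""")

# BLOB table stripped to just an index and the bytes, as suggested above.
conn.execute("""
    CREATE TABLE image_blob (
        meta_id INTEGER PRIMARY KEY REFERENCES image_meta(id),
        data    BLOB NOT NULL
    )
""")
conn.commit()
```

Because the big rows live in `image_blob`, queries and backups of the metadata stay fast even once the BLOB table grows to tens of GB.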

So, having said all of that, how’s this strategy sound?

  1. Vision system stores the file to a NAS shared folder.
  2. A gateway timer script searches the folder for new image files not yet stored in the DB.
  3. If a file is not stored, the gateway script copies it to the DB and deletes it from the shared folder.
  4. The MES build table is compared against the table of stored images; if the MES build table exceeds the stored image table, trigger an alarm.
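Steps 2 and 3 could be sketched roughly like this. This is plain Python using sqlite3 and an `images` table so the logic is self-contained - the function name, table name, and `.png` filter are all assumptions. In an actual Ignition gateway timer script you would read the bytes with `system.file.readFileAsBytes` and insert them with `system.db.runPrepUpdate` instead:

```python
import os
import sqlite3

def sweep_share(share_dir, conn):
    """Copy any image on the share that isn't already in the DB, then
    delete it from the share. Returns the list of files moved."""
    moved = []
    for name in sorted(os.listdir(share_dir)):
        if not name.lower().endswith(".png"):  # assumed image extension
            continue
        # Skip files already stored (file name is the serialized key).
        already = conn.execute(
            "SELECT 1 FROM images WHERE filename = ?", (name,)
        ).fetchone()
        if already:
            continue
        path = os.path.join(share_dir, name)
        with open(path, "rb") as f:
            data = f.read()
        conn.execute(
            "INSERT INTO images (filename, data) VALUES (?, ?)",
            (name, sqlite3.Binary(data)),
        )
        conn.commit()
        os.remove(path)  # only delete once the row is safely committed
        moved.append(name)
    return moved
```

Note the ordering: the file is only deleted after the insert commits, so a crash mid-sweep can never lose an image - at worst the same file is found and skipped on the next pass.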

What I am really after is an alarm that alerts me to the fact that, for some reason, I am not storing images. Contractually I will be required to store 3 years’ worth. I would like to automate the audit procedure.


That sounds OK, although if step 4 runs before step 2, you may end up with spurious alarms. You may have to ensure you do something like check for new files every 5 minutes, but only alarm if you find a file on the disk but not in the database that is older than 10 minutes.
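That grace-period check could look something like this - a sketch only, with the function name, grace period, and the `stored_names` lookup all assumed (in practice `stored_names` would come from a query against the image table):

```python
import os
import time

GRACE_SECONDS = 10 * 60  # only alarm on files older than 10 minutes

def stale_files(share_dir, stored_names, now=None):
    """Return files on the share that are missing from the DB and older
    than the grace period - i.e. the ones worth alarming on."""
    now = time.time() if now is None else now
    stale = []
    for name in sorted(os.listdir(share_dir)):
        if name in stored_names:
            continue  # already in the database, nothing to report
        age = now - os.path.getmtime(os.path.join(share_dir, name))
        if age > GRACE_SECONDS:
            stale.append(name)
    return stale
```

Files younger than the grace period are ignored, so a file written between two runs of the transfer script never raises a spurious alarm.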

I forgot to say that if you’re going to store any meta information about the images, you should do this in a separate table and only include the index of the actual BLOB file.

It’s also up to you whether you delete the files. If you have plenty of space you may just want to move them to a backup store or change their filename in some way.

Yeah, I would expect it to be a timer script, and yes, there would have to be some provision in the alarm pipeline to allow for the delay between image creation on the share and execution of the script that moves it to the DB.

Know of any good examples of using Python to read and store the images, before I reach out to my friend Google?

No need for Google! Ignition does all the heavy lifting for you - they’ve made it shockingly easy :slight_smile:

The information at should get you started in the right direction.

Doh!! Yeah that pretty much sums it up!!