I was asked to store 5000 data points from a PLC into gateway tags, and I wanted to create a big array to hold them. Is there a way to do that in an Ignition UDT?
Is there a reason you want an array to store 5000 data points? What's your end goal here? I only ask because there may be a better way to do what you are trying to do. I assume you already have 5000 PLC tags, or UDTs that make up these 5000 data points?
Hi, the end goal is to store data from the PLC in our database. Our team was thinking that the first step is to get the data from the PLC to the gateway and then do a SQL insert to store it in the database. Yes, you are correct that I already have the PLC tags.
Is there a reason not to use the historian? You can configure the size of change required for a value to be recorded (or set it to record all changes, or record on a timer), and the results are easily used by things like the Easy Chart.
Alternatively, you can use a gateway tag change event: list the tags you want it to watch by name (possibly by folder), and then script something to record the events.
Usually context. If the 5000 data points are related to a single process or event, then you would want to keep them all together.
Without knowing what the values are for, it’s only speculation.
Yes, those data points need to be kept all together to produce a graph later on.
It would be more efficient to read the values directly from the PLC instead of making 5000 tags.
```python
arraySize = 5000

# Path convention in this example is Rockwell-ish.
basePLCpath = '[DeviceNameInIgnition]path.to.float.array[%d]'
pathList = [basePLCpath % i for i in xrange(arraySize)]
valuesIn = system.opc.readValues("Ignition OPC UA Server", pathList)

t_stamp = system.date.now()
valuesOut = []
for i, qv in enumerate(valuesIn):
    # readValues returns QualifiedValues; pull out the raw value.
    valuesOut.extend([t_stamp, i, qv.value])

query = "INSERT INTO table (t_stamp, row, value) VALUES " + ','.join(['(?,?,?)'] * arraySize)
system.db.runPrepUpdate(query, valuesOut, 'dbConnectionName')
```
I see no reason to do any of this in the PLC. I doubt you actually need 5000 tags. Probably, you are storing the data and then pushing to SQL. Why not just push to SQL as the events occur? No need to store anything in the PLC, except a buffer. Most of us use a FIFO with enough room to cache data, use a transaction group to insert, then on handshake unload the FIFO, rinse and repeat. If data must be kept together, create a UDT.
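The FIFO-and-handshake pattern described above can be sketched in plain Python. This is only a conceptual illustration: a `deque` stands in for the PLC-side FIFO, the `sample` and `drain_and_insert` names are hypothetical, and the actual Ignition transaction group or database call is only indicated in a comment.

```python
from collections import deque

# Hypothetical stand-in for the PLC-side FIFO: samples accumulate here
# between database inserts.
fifo = deque()

def sample(value):
    """PLC side: push each new sample into the FIFO as it occurs."""
    fifo.append(value)

def drain_and_insert():
    """Gateway side: unload everything currently buffered, insert it,
    then acknowledge (the 'handshake') so the PLC can reuse the space."""
    batch = []
    while fifo:
        batch.append(fifo.popleft())
    # In Ignition this would be a transaction group or something like
    # system.db.runPrepUpdate(...); here we just return the batch.
    return batch

# Simulate a few scans' worth of samples, then one insert cycle.
for v in [1.0, 2.5, 3.75]:
    sample(v)

inserted = drain_and_insert()
print(inserted)   # [1.0, 2.5, 3.75]
print(len(fifo))  # 0 -- buffer is empty after the handshake
```

The point of the design is that the PLC only ever needs enough memory to buffer between inserts, not the whole run.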
I was totally wrong about the other method. I think this is a better way, but I wonder: if I have 3 identical stations and 4 more data types to read, meaning the total will be 5000 × 3 × 5 = 75,000 data points, would this method still be applicable and efficient?
It would be the most efficient, but are all these values really being logged at the same time? How often will they get logged?
Yes, it would be ideal if all these values are logged at the same time, as lab technicians need the data to create graphs or do calculations after every run. As for how often they get logged, it really depends on the tasks of the day; it is safe to assume we have 10 minutes before the next run starts, though. We also do not want to lock up the interface while we are writing data to the database. Appreciate your input on this!
A couple more questions:
- Are all 75000 points for one event, or are we really talking 25000 on three separate events? (i.e., three separate test stations doing three separate things)
- Dang, I had another one, but now I can’t think of it.
Either way, on the surface, it seems like it can be done in 5000-point chunks.
The three stations are running the same test at the same time, but they are all collecting their own data points.
BTW, ideally I would like my query statement to be like this:

```python
query = "INSERT INTO tablename (runNumber, equipment, stationID, index, Pressure1, Flow1, Pressure2, Flow2, Flow_Ratio), VALUES " + ','.join(['(?,?,?,?,?,?,?,?,?)'] * arraysize)
```
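A self-contained sketch of that 9-column, multi-row prepared insert is below. The table and column names come from the thread (with `Sample_Num` used in place of `index`, since `INDEX` is a reserved word in some SQL dialects and the thread itself switches to `Sample_Num` later); `runNumber`, `equipment`, `stationID`, and the small sample lists are made-up test data, and the database call is only indicated in a comment.

```python
arraySize = 3  # 5000 in the real script; kept small here

# Made-up run metadata and per-sample readings.
runNumber, equipment, stationID = 7, 'RIG-A', 1
p1 = [10.0, 10.5, 11.0]
f1 = [1.0, 1.1, 1.2]
p2 = [20.0, 20.5, 21.0]
f2 = [2.0, 2.1, 2.2]
ratio = [f1[i] / f2[i] for i in range(arraySize)]

# One placeholder group per row, joined with commas.
query = ("INSERT INTO tablename (runNumber, equipment, stationID, "
         "Sample_Num, Pressure1, Flow1, Pressure2, Flow2, Flow_Ratio) VALUES "
         + ','.join(['(?,?,?,?,?,?,?,?,?)'] * arraySize))

# One flat parameter list: 9 values per row, rows back to back,
# matching the placeholder groups in order.
valuesOut = []
for i in range(arraySize):
    valuesOut.extend([runNumber, equipment, stationID, i,
                      p1[i], f1[i], p2[i], f2[i], ratio[i]])

# In Ignition this would then be executed with something like:
# system.db.runPrepUpdate(query, valuesOut, 'dbConnectionName')
```

The invariant to check when this style of query errors out is simply that the number of `?` marks equals the length of the parameter list.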
I tried to use your code to insert just 10 rows for now to the database but I kept getting errors.
You’ll need to show us your code and errors. I can’t quite see it from here.
Paste your code into your post, highlight it and hit the ‘Preformatted text’ button at the top of the editor.
I would start with getting rid of the comma before VALUES in the query that I accidentally put in my example.
> The three stations are running the same test at the same time

> lab technicians need these data to create graphs or do calculation after every run

> runNumber, equipment, stationID, index, Pressure1, Flow1, Pressure2, Flow2, Flow_Ratio
I’m guessing these runs last a period of time, and if that’s the case, I would imagine adding a timestamp column would aid in creating graphs.
edit: nvm, I see Sample_Num in your next reply
I found the same thing as well. One thing that stood out to me is that using the for loop to get everything into valuesOut all at once might not work, or I am just bad at this… I had to make the following modification to get it to work.
```python
valuesOut = []
for i in range(0, 3000):
    valuesOut = [runNumber, equipID, stationID, i, v[i], v[i], v[i], v[i], v[i]]
    query = "INSERT INTO tablename (runNumber, equipmentID, stationID, Sample_Num, Pressure1_PV, Flow1_PV, Pressure2_PV, Flow2_PV, Flow_Ratio) VALUES " + ','.join(['(?, ?, ?, ?, ?, ?, ?, ?, ?)'])
    system.db.runPrepUpdate(query, valuesOut)
```
I remembered my other question! How often does the sampling take place? As mentioned before, many use a FIFO to do the heavy lifting in the PLC. If you can make a UDT on the PLC end that can hold the FIFO data, that may also be an option. The plus side is that you could start logging data right away from the beginning of the test.
You are really making this more difficult than it needs to be. As I thought, each test doesn’t need 5000 points. Dump that idea and move to a simpler approach.
Start with the data: are the data categories always the same per station? If so, then add a column for station ID and, as stated, add a timestamp. Then you can query all day long and graph to your heart's content.
With that basic understanding of the data, you would never have suggested 5000 points.
The PLC scan rate for these data is much faster than the scan rate in Ignition, which is why we need to read and write all the data after the test. Also, we already have UDTs in the PLC to store all the data, if that is what you meant. I just need to move them into the database.