Best tag type / method of creating a queue of orders for a station

I have a few stations I'm tracking assembly progress on. It's a simple setup where, to begin the process, I select an order from a table and then hit Begin. Right now, it won't let you begin an order if there's already an assembly at the first station, which works fine.

For the stations after the first one, it doesn't work so well. Say, I finish my operations at Station A and hit a Complete button (which captures cycle time). How I initially configured it was that as soon as I finish Station A, it will automatically take the appropriate tag values in Station A (like a serial number) and write them to Station B's tags.

This is a dumb setup, because I could be in the middle of an operation at Station B and it will end prematurely.

I would like to implement a queue of incoming serial numbers to a station to solve this problem. That way, if an operation is in progress at Station B and Station A completes its operations faster, Station A's cycle times will be accurate and Station B won't get bad behavior.

A queue is easy to implement in Python, but I would like to duplicate it within the Ignition tag system. Which tag data type should I use, String Array (an array of serial numbers) or a Dataset? Keep in mind that I'll be appending to and popping from this queue regularly, so whichever tag type facilitates that better is probably the better one for me to use.

Hello,

I would think this would be done in the PLCs controlling the process, and Ignition would simply monitor and record this data. I don’t know how time-sensitive your data is, but you are at the mercy of tag group settings and OPC latency.

Frank


That's a good assumption to make for most facilities but this one uses pen and paper and a "data warehouse" to them is a literal warehouse full of boxes of records.

Right now, the operator will both start and complete the operation. There are process values saved to a DB through transaction groups, but they don't really trigger writes to any tags and don't affect the process in any way (which makes sense, given the stage of digitization they're in).

I'm also omitting that there is an MES in this mix, and this setup is partially to accommodate the facility's unique conditions while still recording the appropriate data for use by the MES.

I should add here that I can probably figure out the actual scripting involved to get this working. What I am wondering is whether I'll run into any performance issues (memory leaks and so on) by using either a Dataset tag or a String Array tag, since Datasets are immutable: every time I write to or pop from a Dataset, it creates a new one that I write back to the tag.

How big can the queue get?

Perhaps a global queue as a top-level variable in a project script? If you need to, you could persist it across restarts as well.
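
A minimal sketch of that idea, assuming plain Python (top-level names in an Ignition project library live for the life of the interpreter, so they act as a shared in-memory queue; it is lost on a restart unless you persist it somewhere):

```python
from collections import deque

# Module-level object in a project script: shared across calls,
# but NOT persisted across gateway/script restarts.
_serial_queue = deque()

def push(serial):
	"""Append a serial to the back of the queue."""
	_serial_queue.append(serial)

def pop():
	"""Remove and return the oldest serial, or None if empty."""
	return _serial_queue.popleft() if _serial_queue else None
```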

If it’s not going to get very large and doesn’t need to be serialized I would probably use a document data type and store it as JSON
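
One note on ordering: a JSON *object* is unordered, but a JSON *array* preserves order, so an array works fine as a queue. A sketch of the manipulation in plain Python (in Ignition you'd read and write the Document tag's value around these calls; function names are illustrative):

```python
import json

def enqueue(queue_json, serial):
	"""Append a serial to a JSON-array queue; returns the new JSON string."""
	queue = json.loads(queue_json)
	queue.append(serial)
	return json.dumps(queue)

def dequeue(queue_json):
	"""Pop the oldest serial; returns (serial_or_None, new_json_string)."""
	queue = json.loads(queue_json)
	serial = queue.pop(0) if queue else None
	return serial, json.dumps(queue)

q = enqueue(enqueue("[]", "SN-001"), "SN-002")
serial, q = dequeue(q)
# serial is now "SN-001"; q still holds "SN-002"
```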

The queue should never be over 60 (their 100% "OEE" target rate is 60 a shift) - I would imagine it would be at most around 10 serials.

I saw the Document tag. I've never used one before, so this might be a stupid question, but JSON objects are inherently unordered, whereas order is important for a queue data structure.

I believe a tag is the appropriate location within Ignition to store this; it is globally available and will persist through logouts, restarts, and other malarkey. It is also very easy for me to add to the Station UDT I have created already, so as much as I'd rather create a queue using Python, my potentially incorrect gut feeling is that it should be in a tag.

I think a dataset may be simpler, given the nice dataset script library. If you include timestamps, order can be obtained via sorting before operating on your data.
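
The sort-by-timestamp idea in plain Python terms (rows and values are made up for illustration; on a dataset tag you'd use the dataset script functions instead):

```python
from datetime import datetime

# Hypothetical queue rows: (serial, timestamp) -- oldest should run first.
rows = [
	("SN-003", datetime(2024, 1, 1, 9, 30)),
	("SN-001", datetime(2024, 1, 1, 9, 10)),
	("SN-002", datetime(2024, 1, 1, 9, 20)),
]

# Sort on the Timestamp column before operating; the first row
# is then the next serial regardless of how rows were appended.
ordered = sorted(rows, key=lambda r: r[1])
next_serial = ordered[0][0]
```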

You mentioned they are already leveraging databases; as a plan B, you could maintain your queue list in the database.


If you are recording progress to the database anyways, you can create a query that yields the next operation for a given station. That might be most efficient.
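
A sketch of what that query could look like, assuming a queue table keyed by station (table and column names are hypothetical; in Ignition you'd use `system.db.runPrepQuery`/`runPrepUpdate`, with sqlite3 standing in here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE serial_queue (
	id INTEGER PRIMARY KEY AUTOINCREMENT,
	station TEXT, serial TEXT, queued_at TEXT)""")
conn.execute("INSERT INTO serial_queue (station, serial, queued_at) "
	"VALUES ('B', 'SN-001', '09:10')")
conn.execute("INSERT INTO serial_queue (station, serial, queued_at) "
	"VALUES ('B', 'SN-002', '09:20')")

# "Next operation for a given station" = its oldest queued serial.
row = conn.execute(
	"SELECT id, serial FROM serial_queue "
	"WHERE station = ? ORDER BY queued_at LIMIT 1", ("B",)).fetchone()
if row:
	# Pop it by deleting the row we just claimed.
	conn.execute("DELETE FROM serial_queue WHERE id = ?", (row[0],))
```

An added bonus of the DB route is that the same table doubles as a historical record if you flag rows as completed instead of deleting them.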

My concern with a memory dataset tag is that they are stored in the internal database.

If the size isn't too great then it probably isn't a major concern.

Technically, Datasets and databases aren't ordered either; you have to enforce that order. The same can be done with JSON.

With the size of load you're currently looking at, it's probably not an issue with any direction you choose. However, if there is ever going to be the intent to scale this up, you might want to plan for that.

I think I would agree that if you have a database at your disposal then that might be your best option. An added bonus is that you can also then have a historical record.


Is this in the docs somewhere that I missed? I only tested this with a single incoming order, so it might become obvious when I do a more rigorous test. Right now, I'm popping from and appending to a dataset, which heavily relies on that order not being arbitrarily changed on me.

Uhm, no. Datasets are explicitly ordered for both rows and columns. For DB supplied data, the order is as returned from the DB. For manually constructed datasets, both are ordered as constructed.


Here is what I ended up doing, which I already demoed to the customer and it seems like it'll work just fine:

  1. Create a Dataset tag in the Station UDT (this will vary on your implementation). I created a Dataset with only two columns: a String column called Serial and a Date column called Timestamp.

  2. Add the following scripts:

def addSerialToQueue(tagPath, serial):
	"""
	Appends a row to the serial-queue dataset located at <tagPath>.
	
	Parameters:
		- tagPath: path of the serial queue dataset tag
		- serial: the serial number to append
	
	Return:
		- None
	"""
	# get the current dataset (readBlocking expects a list of paths)
	serialDs = system.tag.readBlocking([tagPath])[0].value
	
	if serial:
		# datasets are immutable; addRow returns a new dataset
		serialDs = system.dataset.addRow(serialDs, [serial, system.date.now()])
		
		# write the new dataset back to the tag
		system.tag.writeBlocking([tagPath], [serialDs])

def popSerialFromQueue(tagPath):
	"""
	Pops the first row from the serial-queue dataset located at
	<tagPath>, deletes it, and returns its serial number.
	
	Parameters:
		- tagPath: path of the serial queue dataset tag
	
	Return:
		- serial number to run next, or None if the queue is empty
	"""
	serialDs = system.tag.readBlocking([tagPath])[0].value
	serial = None
	
	if serialDs.getRowCount() > 0:
		# pop the current serial
		serial = serialDs.getValueAt(0, "Serial")
		
		# deleteRow also returns a new dataset
		serialDs = system.dataset.deleteRow(serialDs, 0)
		
		# write the new dataset back to the tag
		system.tag.writeBlocking([tagPath], [serialDs])
		
	return serial
  3. Determine the triggers you want to use to append to and pop from SerialQueue (could be a tag change, button action, etc.). Call the appropriate function to get or set the data you need.

Added benefit

Since the dataset tag is a Dataset tag (a wordsmith, I know), it's very easy to get the SerialQueue into a Table - just bind the Table's data property to the appropriate station's SerialQueue tag (which is part of a Station UDT for me). This ensures that accurate information will always be displayed in the station - whenever a serial gets appended or popped, it'll appear that way in the table immediately without having to refresh any bindings. Right now, I have a Table of incoming Serials to a station and an operator needs to select one and hit a Start button to start the cycle time.

This was my hesitation with using a DB table for this: I need to make sure to refresh the table component (either by polling or message handlers) every time a change occurs to SerialQueue. Using a Dataset tag works flawlessly for now and really makes this system work well (in dev - we'll see if this carries over to production, where we have those pesky operators always messing stuff up)


The only caveat here is, if you restore from a backup, this tag gets overwritten. Not that it ever happened to me... :roll_eyes:
