Update either dataset or array

Anyone have a set script or an example of this? I want to create either a float array or a dataset with 60 values. I want to read in a tag and then put that live value into element 0, move element 0 into element 1, element 1 into element 2, etc. The last element, in position 59, can be replaced by element 58. Essentially I want to keep track of the last 60 values in a dataset or array.

  1. Are you trying to create a first-in/first-out shift register?
  2. What is the clock source (that will trigger a shift)?
  3. You need to move 58 → 59 first, 57 → 58 next, etc. If you do it your way then you will fill the array with element 0.
  4. Often a more efficient technique is to not shift the array elements but have an array pointer which increments on each shift and wraps when the end is reached. The latest value is written into the element that the pointer is pointing to.
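The pointer technique in item 4 can be sketched in plain Python (not Ignition-specific; the class name and sizes here are illustrative):

```python
# Minimal ring-buffer sketch: instead of shifting 60 elements on every
# sample, keep a write pointer that advances and wraps at the end.
class RingBuffer:
	def __init__(self, size):
		self.buf = [None] * size
		self.ptr = 0  # index of the next element to overwrite

	def push(self, value):
		self.buf[self.ptr] = value
		self.ptr = (self.ptr + 1) % len(self.buf)  # wrap around

	def snapshot(self):
		# Oldest-to-newest view, skipping slots not yet written
		ordered = self.buf[self.ptr:] + self.buf[:self.ptr]
		return [v for v in ordered if v is not None]

rb = RingBuffer(3)
for v in (10, 20, 30, 40):
	rb.push(v)
print(rb.snapshot())  # [20, 30, 40] -- only the last 3 values survive
```

Note that nothing moves on a push: only the pointer advances, so each sample costs O(1) instead of O(n).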

Can you clarify?

Yes, first in, first out. I want to do it every second if possible. True about moving 58 into 59, 57 into 58, etc. Your idea about the array pointer would work. Any quick example code you could share?

If shifting is entirely based on time interval, consider just using the recorder() expression function from my Integration Toolkit. Configure it to keep 60 rows at 1000ms intervals.

Thanks. I checked it out and it looks like it would work pretty well. Do I have to import anything in order to use the recorder function?

You have to install the module (free) in your gateway. It adds a bunch of expression and scripting functions to your system.

I looked up the integration toolkit and couldn't find a direct link. Can you share the link or direct me on where to download it from? Thanks

Here, officially:

https://www.automation-pros.com/modulesales.html

and here (but links back to the above for download):

and here (for announcements):


Thanks again. I was able to install the module and the recorder function came right up. With that in mind would I just be creating an expression tag of type dataset and the expression would be the recorder function?

Here is what I have but it is not populating...

I have tried four different ways to specify the columns and row values, including with and without brackets, and with the tag as a string or as a tag reference with '{}' around it.

Without an example in the documentation I am unsure of how to get it to work.

Do I need to have a separate dataset memory tag and then another expression tag with the recorder function where I can use the dataset in the recorder function versus the colName and colValue?

Hmm. Will fix.

The column name needs to be a string. The tag needs to be a curly-brace reference. Something like this:

recorder(
	1000, // Pace
	60,   // # of rows
	'Weight',
	{[~]Process Trending/Delaq/PV_Silo1TotalWeight}
)

If you are going to be subscribing to this tag in Vision (or monitoring in the designer), you should do this:

nonTransient(
	recorder(
		1000, // Pace
		60,   // # of rows
		'Weight',
		{[~]Process Trending/Delaq/PV_Silo1TotalWeight}
	)
)

If you organize your tags such that the recorder is in the same folder/UDT as the source values, you can use this syntax instead of a full path:

....
	{[.]PV_Silo1TotalWeight}
....

Be sure to set the recorder's expression tag to Event Driven execution.

Here are my results...

I am getting a configuration error for the expression dataset tag.

Suggestions?

I don't have to have an empty dataset already created, correct?

How can I troubleshoot the configuration error for the tag?

No.

First, check the tag diagnostics. (Upper right in the tag editor.)

If not indicated there, wrap the whole expression in debugMe() like so:

debugMe('someKey',
nonTransient(
	recorder(
		1000, // Pace
		60,   // # of rows
		'Weight',
		{[~]Process Trending/Delaq/PV_Silo1TotalWeight}
	)
)
)

And look at your logs.

(I generally don't indent inside debugMe() since I expect to remove that after I find the problem.)

Also note this possible issue with recent versions of Ignition (not just for my toolkit):

Looks like it's not accepting the function 'recorder'

Looks like that tag provider bug I linked. You probably need to restart your gateway. Or maybe just the tag provider.

I reset the tag provider and it worked! I did notice that whenever the tag was restarted, all the values reset. I wanted to write the whole dataset to a memory tag so the values don't get erased if that happens. I tried setting a value change script on the expression tag to read the expression tag's dataset and write it to a memory tag dataset, and that's not working...

Here is the expression on the expression tag value changed script...

Here is the memory dataset tag. I did not set any columns.

Do I have to set the memory dataset with the correct columns (empty) first? Do I have the value change script incorrect?

The memory tag is NULL and it's not getting updated at all.

Do not edit/restart the tag. The function instance holding the data is thrown away by Ignition if you do. The recorder must start over when you do that. (Just like it has to start from scratch on gateway startup.)

You can manipulate what the expression does at runtime by moving the pace and rowLimit values to peer memory tags, and you can dynamically control the output column list by providing a dataset of sample values instead of explicitly naming and binding your tag(s).

If you need history in spite of tag resets, use the historian and/or transaction groups. Or, perhaps, use my toolkit's globalVarMap() infrastructure to make a long-lived jython variable with your content (scripted, non-trivial).

Side note: when you have a valueChange script on a tag, don't use readBlocking() to obtain the new value; it is handed to your script with zero overhead as currentValue.value.
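For reference, a tag "Value Changed" event script along those lines might look like this (a sketch only: it runs inside the Ignition gateway, so it is not runnable standalone, and the destination tag path is made up):

```python
# Ignition tag "Value Changed" event script (gateway-scoped; sketch only).
# The new qualified value arrives as the currentValue argument, so no
# system.tag.readBlocking() call is needed.
def valueChanged(tag, tagPath, previousValue, currentValue, initialChange, missedEvents):
	# currentValue.value is the new dataset itself
	ds = currentValue.value
	# Copy it to a memory tag (hypothetical path -- adjust to your provider)
	system.tag.writeBlocking(['[default]Backup/RecorderCopy'], [ds])
```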

I created a script I named FIFO to do just that, using Java's LinkedBlockingQueue:

from java.util.concurrent import LinkedBlockingQueue

def getQueue(name, capacity=4000):
	if not system.util.getGlobals().has_key(name):
		return LinkedBlockingQueue(capacity)
	return system.util.getGlobals()[name]
	
def setQueue(name, q):
	system.util.getGlobals()[name] = q
	
def addToQueue(name, value):
	q = getQueue(name)
	if q.remainingCapacity() == 0:
		q.poll()
	q.offer(value)
	setQueue(name, q)
	
def countInstancesOf(name, value):
	q = getQueue(name)
	ct = 0
	for val in q.iterator():
		if val == value:
			ct += 1
	return ct
	
def clearQueue(name, capacity=4000):
	setQueue(name, LinkedBlockingQueue(capacity)) 
	

Those are racy and prone to memory leaks. Always use .setdefault to reliably initialize a value in a dictionary when a check for the key's presence fails. When the value is an object that will be used all over the library, use the .setdefault technique to establish it as a top-level variable in the library.
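To illustrate the .setdefault technique with a plain dict standing in for system.util.getGlobals() (the queue name and capacity are illustrative, and a deque stands in for the Java queue):

```python
from collections import deque

_globals = {}  # stand-in for system.util.getGlobals()

def getQueue(name, capacity=4000):
	# One check-and-initialize step: if the key is absent, the default
	# is stored AND returned, so two racing callers can never each end
	# up holding their own private (leaked) queue.
	return _globals.setdefault(name, deque(maxlen=capacity))

q1 = getQueue('myFifo', capacity=3)
q2 = getQueue('myFifo')  # same object, not a fresh queue
q1.append(42)
print(q2[0])  # 42 -- both names reference the one shared deque
```

One caveat: dict.setdefault evaluates its default argument even when the key already exists, so keep that default cheap to construct.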

Study this topic for the techniques needed to avoid crashing your gateway:

If you cannot use my globalVarMap(), you have to approximate it with an extra layer in system.util.getGlobals().
