Dynamically parsing out data from Table into Custom Property then back into SQL

I am able to get data from a SQL database to a Table in Perspective.

Let's say Table A has an Item ID and a percentage. This table could have 1 entry or 15 entries.

I would like to transfer these to a custom property.
Would I be able to use a for loop for this?

[image: custom property showing keys AMT and ITEM_ID]
Can I rename these to whatever I want?

What do you want your custom property to be specifically?

Do you want it to be in the same format as the key prop? If so, you can bind your key prop to some custom property elsewhere.

The name of that key comes from the binding you have set. Whenever the binding updates, the names will go back to AMT and ITEM_ID (I'm guessing the column names in the database). You can't really change them by just clicking on them. You could change them with a script transform. What does your binding look like?

Let's say Weight and Location. As far as the format goes, it's more that I do not need ALL of the information, just bits and pieces.

I know I can bind each one individually, but was hoping to see if I could get this to work without having to do that.

You can do this with a script transform, e.g.:

return [
	{
		"amount": row["AMT"],
		"itemId": row["ITEM_ID"]
	}
	for row in value
]

Or, imperatively, if that's more readable than a list comprehension:

ret = []
for row in value:
	ret.append({
		"amount": row["AMT"],
		"itemId": row["ITEM_ID"]
	})

return ret

However, if you just need to rename columns, then I would recommend using aliases in your query; transforms are significantly more "expensive" in terms of performance than directly returning the data in the format you want in the first place.
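For example, a minimal sketch of the aliased query (the table name ITEM_TABLE is a placeholder, not from your project):

SELECT AMT AS amount,
       ITEM_ID AS itemId
FROM ITEM_TABLE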


I will be doing some calculations with the returned values.

Now for the fun part. I eventually want to take the information stored in this property and use it to query a SQL database.

Will I need to use a for loop and a query string?

Don't run a SQL query in a transform.

Let's take a step back, lest we go down an XY problem rabbit hole.

You've got one outer SQL query returning a list of items.
You want to do some calculations on these items.
Then, at some point, you want to issue additional queries?

Can you more clearly outline what the flow of data is here? What are operators/end users doing to trigger a query to run, or, if they aren't, can this calculation happen entirely outside of Perspective?



[image: operator work order screen]

So here is the operator screen. When the Operator clicks on certain work orders, the custom properties will calculate out.


I am using the custom properties as placeholders until the operator gets the work order/run set the way they want it. I am building a query to send back to the database.

When the operator is ready, I would like to send the custom properties to a SQL database. (Probably off of a push button or something not automatically.)

I suppose these calculations could happen outside of Perspective, but I think there will always be a need to edit or change something.

Flow of data is SQL to Tables, to Custom Properties, and back to SQL if possible.

Let me know if none of this makes sense.

Got it; that makes more sense.

I would say you're totally fine to 'stage' values in custom properties.

When you're ready to insert the data, I would at minimum have a named query to handle the actual data insert. You could also write a project library script (e.g. dataEntry with some function enterOperatorData) that's as "pure" as possible - it accepts whatever parameters you have to store, and sends them into the database. That way you're separating the data entry process (as far as DB logic) from any logic in your Perspective screen, making it easier to change one or the other independently.
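As a rough sketch of what that project library script could look like (the script name dataEntry and function enterOperatorData come from above; the named query path Batch/InsertBatchRow and its parameter names are made up here):

# Project library script: dataEntry
def enterOperatorData(workOrderId, formulaId, itemId, sourceId, batchWgt):
	"""Insert one staged row into the database via a named query."""
	params = {
		"workOrderId": workOrderId,
		"formulaId": formulaId,
		"itemId": itemId,
		"sourceId": sourceId,
		"batchWgt": batchWgt
	}
	# The named query itself holds the INSERT statement.
	return system.db.runNamedQuery("Batch/InsertBatchRow", params)

Your Perspective button script then just loops over the staged rows and calls dataEntry.enterOperatorData(...) for each one, so the screen never contains any SQL.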


If I am doing multiple inserts, would a for loop be the most efficient way to send this back to the database?

If you don't know the exact number of values, then yes, dynamically issuing N queries in a loop is fine.

For best performance/atomicity, you should probably wrap in a transaction: system.db.beginTransaction - Ignition User Manual 8.1 - Ignition Documentation

The network traffic to the DB is basically the same, but the database will 'hold' all the records until the transaction is finalized, at which point it can update indices/insert the actual data into storage in one operation.
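A minimal sketch of that pattern (table and column names are placeholders, not your schema):

# N inserts wrapped in a single transaction; all-or-nothing on commit.
txId = system.db.beginTransaction(timeout=5000)
try:
	query = "INSERT INTO MY_TABLE (COL_A, COL_B) VALUES (?, ?)"
	for row in stagedRows:  # e.g. your custom property of staged values
		system.db.runPrepUpdate(query, [row['colA'], row['colB']], tx=txId)
	system.db.commitTransaction(txId)
finally:
	system.db.closeTransaction(txId)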

What Paul said, but also consider dynamically constructing "Prep" inserts with multiple VALUES elements. JDBC allows 4k "?" parameters at a time, IIRC. This reduces the number of DB round trips.
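A rough sketch of that batching approach (again with placeholder names; check your driver's actual parameter limit before relying on 4k):

# One INSERT with a VALUES tuple per staged row = one round trip to the DB.
rows = [[r['colA'], r['colB']] for r in stagedRows]
placeholders = ", ".join(["(?, ?)"] * len(rows))
query = "INSERT INTO MY_TABLE (COL_A, COL_B) VALUES " + placeholders
args = [v for row in rows for v in row]  # flatten into one parameter list
system.db.runPrepUpdate(query, args)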


com.inductiveautomation.ignition.common.script.JythonExecException
Traceback (most recent call last):
File "function:runAction", line 5, in runAction
TypeError: toPyDataSet(): 1st arg can't be coerced to com.inductiveautomation.ignition.common.Dataset

def runAction(self, event):
	#txId = system.db.beginTransaction(timeout=5000)
	#status=2
	tableData = self.parent.parent.getChild("LeftContainer").custom.BatchData
	pyData = system.dataset.toPyDataSet(tableData)

	WorkOrderDataQuery = "INSERT INTO BATCH (WORKORDER_ID, FORMULA_ID, ITEM_ID, SOURCE_ID, BATCH_WGT VALUES (?, ?)"

	for row in pyData:

		args = [row[0], row[1], row[2], row[3], row[4]]
	
		system.db.runPrepUpdate(WorkOrderDataQuery, args)
	#system.db.commitTransaction(txId)
	#system.db.closeTransaction(txId)


I am going to assume I am seeing this error because I am trying to push one datatype into a different one. Am I going down the right path?

You don't need to coerce tableData into a PyDataset; you can directly iterate over it:

def runAction(self, event):
	#txId = system.db.beginTransaction(timeout=5000)
	#status=2
	tableData = self.parent.parent.getChild("LeftContainer").custom.BatchData

	WorkOrderDataQuery = "INSERT INTO BATCH (WORKORDER_ID, FORMULA_ID, ITEM_ID, SOURCE_ID, BATCH_WGT) VALUES (?, ?, ?, ?, ?)"

	for row in tableData:
		args = [row['WorkOrderID'], row['FormulaID'], row['ItemId'], row['SourceID'], row['BatchTgt']]
		system.db.runPrepUpdate(WorkOrderDataQuery, args)
	#system.db.commitTransaction(txId)
	#system.db.closeTransaction(txId)

Thank you so much for the help!