Copy a record via a push button

I have a recipe table set up which services several machines. I have a transaction that is triggered by the PLC and uses a WHERE clause on the part number to do a DB->OPC transfer. Everything works there, no problem. The problem starts now: up until recently, parts were dedicated to certain machines. Now we are allowing multiple machines to run a part number, which is fine, except each recipe needs to be “tweaked” for the individual machine. To do this we added a machine number column that uniquely identifies that part number's record for each machine.

Now I have a lot of recipes that need to be duplicated, both now and in the future, so I want to be able to copy a complete recipe (record) and update the machine number. So essentially, duplicate the entire recipe row with the exception of an updated machine number column. Is there an easy way to do this other than scripting the button's mouse-clicked action to run system.db.runPrepUpdate with all 120 individual recipe columns?

It would seem that if I can already query this into a table or custom dataset, and already have all my data formatted, I should be able to create a new record more easily, without the mind-numbing typing.

I’m not sure I follow: is the mind-numbing typing the script you write for the prep update on button push?

[EDIT] Okay, I think I see where you are going with this. Writing an insert query for all 120 columns is a lot to type.
You can use a query to do this.

insert into recipes
select * from recipes
where id = ?

This will copy all the columns; to have MySQL generate a new primary key you'll need to exclude (or null out) the auto-increment column rather than copy it. What DB are you using?
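If listing 120 columns by hand is the sticking point, the column list can also be built in script. A minimal sketch, with hardcoded stand-in column names (in practice you'd pull them from INFORMATION_SCHEMA.COLUMNS); table and column names here are hypothetical:

```python
# Build an INSERT ... SELECT that copies one recipe row to a new machine
# without typing 120 column names. The primary key column is dropped so
# the database can generate a new one, and the machine column becomes a
# parameter placeholder.
def build_copy_sql(table, columns, pk_col, override_col):
    cols = [c for c in columns if c != pk_col]
    select_exprs = ["?" if c == override_col else c for c in cols]
    return ("INSERT INTO %s (%s) SELECT %s FROM %s WHERE %s = ?"
            % (table, ", ".join(cols), ", ".join(select_exprs), table, pk_col))

sql = build_copy_sql("recipes",
                     ["id", "part_number", "machine_id", "temp_sp", "speed_sp"],
                     pk_col="id", override_col="machine_id")
print(sql)

# In Ignition this would run as (system.db only exists inside Ignition):
# system.db.runPrepUpdate(sql, [newMachineId, recipeId])
```

The real column list could be fetched once with a query against INFORMATION_SCHEMA, so adding a column later doesn't mean retyping anything.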

Yes, it is at least partially that, and also the updates that then have to happen, in every place they might want to add or copy a recipe, any time a column is added. Add to that the awful readability of the line where I assign the 120 fields for the update/insert statement. I was just hoping for a quicker, easier way, since I can query the existing recipe and already have all the data in a dataset that matches the table perfectly.

I am using MSSQL located on a remote server that is not administered by me (nor do I have rights), so everything I do must go through Ignition to have the rights to do anything other than read.

This doesn’t seem to have an easy answer that I can think of off the top of my head. You can chain named queries:

system.db.runNamedQuery('CreateTempTable')
system.db.runNamedQuery('UpdateTempTable')
system.db.runNamedQuery('InsertTempTable')
system.db.runNamedQuery('DropTempTable')

So basically you create a temporary table based on your recipe table, update the columns you want, then insert this into the recipe table, then delete the temporary table.


CREATE TABLE temp_table AS SELECT * FROM recipes
UPDATE temp_table SET machine_id = 3
UPDATE temp_table SET primary_key = NULL
INSERT INTO recipes SELECT * FROM temp_table
DROP TABLE temp_table

Maybe this can help you?
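The same sequence could also be driven from one button script instead of four named queries. A minimal sketch, using the MSSQL `SELECT ... INTO` form (MSSQL has no `CREATE TABLE AS SELECT`); table and column names are hypothetical, and if `id` is an IDENTITY column the NULL step won't be enough on its own, so treat this as a starting point:

```python
# Build the temp-table statement sequence: copy the row into a scratch
# table, tweak the machine number, clear the key, insert back, clean up.
def temp_table_statements(recipe_id, new_machine_id):
    return [
        ("SELECT * INTO temp_table FROM recipes WHERE id = ?", [recipe_id]),
        ("UPDATE temp_table SET machine_id = ?", [new_machine_id]),
        ("UPDATE temp_table SET id = NULL", []),
        ("INSERT INTO recipes SELECT * FROM temp_table", []),
        ("DROP TABLE temp_table", []),
    ]

stmts = temp_table_statements(7, 3)

# On the button's actionPerformed event in Ignition
# (commented, since system.db only exists inside Ignition):
# for sql, args in temp_table_statements(recipeId, newMachineId):
#     system.db.runPrepUpdate(sql, args)
```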

It may help, at least in that there seems to be no obvious easier way. As for creating the temporary table and such, I can follow the [quote=“dkhayes117, post:5, topic:19689”]
CREATE TABLE temp_table AS SELECT * FROM recipes
UPDATE temp_table SET machine_id = 3
UPDATE temp_table SET primary_key = NULL
INSERT INTO recipes SELECT * FROM temp_table
DROP TABLE temp_table
[/quote]

However I have no idea how to implement this into the button to make it happen, I also have never used named queries. I am a database kindergartner and an Ignition 3rd grader. I am just now learning not to eat paste.


Some older versions of Ignition do not have named queries; I think they were introduced in 7.9.4. What version do you have?

I am currently using 7.9.4 and yes, I do have named queries on my tree, I just haven’t used them yet.

Okay, so how about this: is there a way to store a pydataset directly to a table? I have all of my queried data in a pydataset and can move it down to a table one cell at a time; could it be pushed out as one dataset as an insert somehow?

Not that I know of; the dataset has to be iterated through on an insert. When you bind a table to a query, the table data from the query is already a dataset, so moving it as a pydataset to another table doesn't really change anything. You may look into setting up a script to export the data as a CSV, then import that into the db. I've never done this, and I don't use MSSQL, so you may have to research the method. CSV import and the very repetitive runNamedQuery method are all I can think of. There may be a better solution that someone else can come up with?
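The iteration described here can be sketched outside Ignition, with plain lists standing in for the PyDataSet (table and column names are made up; the actual inserts would go through system.db.runPrepUpdate):

```python
# Turn a dataset (rows of values) into one parameterized INSERT per row.
def rows_to_inserts(table, columns, rows):
    placeholders = ", ".join("?" for _ in columns)
    sql = "INSERT INTO %s (%s) VALUES (%s)" % (
        table, ", ".join(columns), placeholders)
    return [(sql, list(row)) for row in rows]

stmts = rows_to_inserts("recipes", ["part_number", "machine_id"],
                        [["709-A", 3], ["709-B", 3]])

# In Ignition:
# for sql, args in stmts:
#     system.db.runPrepUpdate(sql, args)
```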


Okay, so I have almost everything working on this little project. My last hurdle relates to the pydataset values returned from the NVARCHARs queried out of the SQL data. When I go to write the variable back into a table using runPrepUpdate I get an error; I believe the most important part is the text “The conversion from UNKNOWN to UNKNOWN is unsupported”,
and I assume this is because of a type mismatch between the pydataset and SQL. The pydataset value is u’Kalcor 709’, while the result from an SQL query returns just the text without the u’XXXXXXX’.

I have searched the forums and I have seen some information regarding this, but not how to change or coerce this data into a format acceptable to SQL.

Without seeing anything else, you can convert the value to a string - str(value). Then replace the u' and ' with replace() if needed

Post your code. u'XXXXX' is just the Python repr() output for a unicode string - which is exactly what an NVARCHAR column is. You’re probably still a layer inside the PyDataset (eg, you’re trying to insert a PyRow) rather than directly inserting a string.
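The suspected mix-up can be sketched outside Ignition, with plain lists standing in for the PyDataSet and PyRow (column names are made up):

```python
# Passing a whole row to the prep update instead of a single cell value
# is what produces the UNKNOWN conversion error.
pyds = [[u"Kalcor 709", 3]]   # one row: (color_name, machine_id)

row = pyds[0]     # still a sequence, not a string; passing this as a
                  # parameter value is the bug
value = row[0]    # the actual unicode string, which maps cleanly to NVARCHAR

# In Ignition:
# system.db.runPrepUpdate("UPDATE recipes SET color = ? WHERE id = ?",
#                         [value, recipeId])
```

The u'...' prefix disappears as soon as the string is printed or bound as a parameter; it only appears in the repr.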
