I need to SELECT data from an MS SQL table and then insert it into MySQL

If I had admin rights to the MS SQL server I would:

  • Create a buffer table with the same structure as the monitored table.
  • Create an INSERT trigger on the monitored table. This would copy any inserted rows into the buffer table.
  • Create a query tag to count the number of rows in the buffer table.
    SELECT COUNT(id) FROM buffer_table
    where id is the primary key or another field name.
  • Create an onChange event on that tag and get it to call a gateway script.
  • The gateway script will,
    • Read the buffer table.
    • Write the data to the MySQL database.
    • Delete the rows from the buffer. (Note that you don't delete all the rows in the buffer since new data may have arrived since you read the buffer table. You need to specifically delete the rows actually transferred. A sketch of such a gateway script follows this list.)
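
Something like this for the gateway script, as a rough sketch only. The buffer_table and target_table names, the id/value columns, and the 'MSSQL'/'MySQL' connection names are placeholders for whatever you actually have:

def transfer_buffer():
    # Read whatever is currently sitting in the buffer table on the MS SQL side.
    buffered = system.db.runPrepQuery("SELECT id, value FROM buffer_table", [], 'MSSQL')

    for row in system.dataset.toPyDataSet(buffered):
        rowId, value = row[0], row[1]
        # Write the row to the MySQL database...
        system.db.runPrepUpdate("INSERT INTO target_table (id, value) VALUES (?, ?)", [rowId, value], 'MySQL')
        # ...then delete only the row just transferred, so rows that arrived
        # after the SELECT above stay in the buffer for the next run.
        system.db.runPrepUpdate("DELETE FROM buffer_table WHERE id = ?", [rowId], 'MSSQL')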

I already had it working in a similar way, but when the table is very large it takes a long time (there are about 10 thousand records).
But is it possible to do this the way described above?

I solved the issue, but I'm not sure about handling changed data.

def insert_update_tables():
    query = "SELECT * FROM transfer_data"
    args = []
    result = system.db.runPrepQuery(query, args, 'DB')

    query2 = "select cast(_IDRREF as bigint) as id, cast(_Fld11787RRef as bigint) as st_in, cast(_Fld11786RRef as bigint) as id_num, cast(_Fld13891RRef as bigint) as shift from _Document11782 where _Date_Time >= dateadd(day,-5,(DATEADD(year,-2000, SYSDATETIME()))) and _Marked = 0 and _Fld11795 = 0 and _Posted = 1"
    args1 = []
    result2 = system.db.runPrepQuery(query2, args1, 'ALK')

    if result.getRowCount() > 0:
        # result2 = lima_1c (MS SQL), result = MySQL custom table
        pydata = system.dataset.toPyDataSet(result2)
        pydata1 = system.dataset.toPyDataSet(result)

        editSet = set(tuple(row) for row in pydata)
        masterSet = set(tuple(row) for row in pydata1)
        deletions = [list(row) for row in (masterSet - editSet)]
        changes = [list(row) for row in (editSet - masterSet)]
        print deletions

        if len(deletions) > 0:
            if result2.getRowCount() < result.getRowCount():
                for datacon in deletions:
                    system.db.runPrepUpdate("DELETE FROM transfer_data WHERE id = ? AND st_in = ? AND id_num = ? AND shift = ?", datacon, 'DB')
                    print "Deleted some row"
            elif result2.getRowCount() > result.getRowCount():
                for datacon in deletions:
                    datacon = [long(value) for value in datacon]
                    system.db.runPrepUpdate("INSERT INTO transfer_data(cast(id as bigint) as id, cast(st_in as bigint) as st_in, cast(id_num as bigint) as id_num, cast(shift as bigint) as shift) VALUES(?, ?, ?, ?)", datacon, 'DB')
                    print "Insert data here"

        if result.getRowCount() == result2.getRowCount() and len(changes) > 0:
            for change_row in changes:
                change_row = [long(value) for value in change_row]
                # Reorder (id, st_in, id_num, shift) to match the placeholder
                # order st_in, id_num, shift, id in the UPDATE below.
                params = change_row[1:] + change_row[:1]
                system.db.runPrepUpdate("UPDATE transfer_data SET st_in = ?, id_num = ?, shift = ? WHERE id = ? LIMIT 1", params, "DB")

    if result.getRowCount() == 0:
        pydata2 = system.dataset.toPyDataSet(result2)
        for index in range(result2.getRowCount()):
            datacon = list(pydata2[index])
            system.db.runPrepUpdate("INSERT INTO transfer_data(id, st_in, id_num, shift) VALUES(?, ?, ?, ?)", datacon, "DB")

Thanks everyone for the help!
It's very cool that people here answer so quickly and give good advice on the forum.

Your code indentation and some line breaks are all messed up. Can you edit to fix?

Python syntax allows multiline strings using the """ syntax. This can improve code legibility greatly.

query2 = """select 
              cast(_IDRREF as bigint) as id, 
              cast(_Fld11787RRef as bigint) as st_in, 
              cast(_Fld11786RRef  as bigint) as id_num,
              cast(_Fld13891RRef as bigint) as shift 
            from _Document11782 
            where _Date_Time >= dateadd(day,-5,(DATEADD(year,-2000, SYSDATETIME())))  
              and _Marked = 0 
              and _Fld11795 = 0 
              and _Posted = 1
          """

I'd also suggest not putting the if/loop conditions and bodies on the same line.
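
For example, with placeholder names:

# Harder to scan:
if some_condition:    do_something()

# Easier to scan:
if some_condition:
    do_something()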

edit:
What's the point of casting here:

INSERT INTO transfer_data(
	cast(id as bigint) as id,
	cast(st_in as bigint) as st_in,
	cast(id_num as bigint) as id_num,
	cast(shift as bigint) as shift
)
VALUES (?, ?, ?, ?)

Those are just the names of the columns you want to insert into, or am I missing something?

It's just the columns :)
I already solved the task.
Thank you!

Your script doesn't use a transaction (see the documentation for system.db.beginTransaction()). That means a separate task in Ignition that looks up anything in the MySQL table will be disrupted while the update is running. With a transaction, other users of MySQL will "see" either the old version, or the new version, but not the intermediate state while you are inserting rows.

Ohh, I don't know how to fix this.
Any ideas?

Follow the link in my prior comment to the documentation for beginTransaction(). Study the examples. Try it. Come back and tell us if you have difficulties.
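
The rough shape is something like this (rows_to_insert here is just a stand-in for whatever list of rows your script has built):

# Open a transaction against the MySQL connection; nothing becomes visible
# to other sessions until commitTransaction() is called.
txId = system.db.beginTransaction('DB')
try:
    for datacon in rows_to_insert:
        system.db.runPrepUpdate(
            "INSERT INTO transfer_data (id, st_in, id_num, shift) VALUES (?, ?, ?, ?)",
            datacon, 'DB', txId)
    # Other MySQL users see either the old data or all of the new rows,
    # never a half-finished transfer.
    system.db.commitTransaction(txId)
except:
    # Something failed part way through; undo everything.
    system.db.rollbackTransaction(txId)
    raise
finally:
    system.db.closeTransaction(txId)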

I also recommended that you use multi-value inserts instead of one row at a time. You can't do all ten thousand rows in one insert, but you should be able to do a few hundred at a time. This will dramatically speed up your script.
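
A rough sketch of that, assuming rows of (id, st_in, id_num, shift) and batches of 200:

def insert_in_batches(rows, batch_size=200):
    for start in range(0, len(rows), batch_size):
        chunk = rows[start:start + batch_size]
        # One "(?, ?, ?, ?)" group per row in this batch.
        placeholders = ", ".join(["(?, ?, ?, ?)"] * len(chunk))
        query = "INSERT INTO transfer_data (id, st_in, id_num, shift) VALUES " + placeholders
        # Flatten the batch into a single argument list matching the placeholders.
        args = [value for row in chunk for value in row]
        system.db.runPrepUpdate(query, args, 'DB')

Pass the transaction id from beginTransaction() as the fourth argument to runPrepUpdate() if you want all of the batches committed together.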

{ Tip: When someone on this forum gives you specific, step by step instructions to solve your problem, follow all of the steps. }
