Hello-
I have a script that opens a .csv with somewhere between 5,000 and 10,000 rows and inserts the data into a SQL table. It works just fine but runs fairly slowly (I think): it takes about 15-20 seconds to execute. If 15 seconds is much slower than you would expect a task like this to take, what best practices would make it run faster? I imagine the issue is calling system.db.runPrepUpdate 5,000-10,000 times. Is there a better way to get all this data inserted at once?
Here’s a simplified version:
import csv
# get file path of upload and then convert csv data to dataset
csvPath = event.source.parent.parent.getComponent('Container').getComponent('File Explorer').selectedPath
# open selected csv file
csvFile = open(csvPath, 'rU')
print csvFile
# read selected csv file
csvRead = csv.reader(csvFile)
csvDataset = []
for row in csvRead:
    csvDataset.append(row)
csvFile.close()
# get upload time
uploadTime = system.date.now()
# start db transaction
txID = system.db.beginTransaction('Quality', timeout=10000)
# delete all orders from table
query = "TRUNCATE TABLE inventory_dump"
system.db.runUpdateQuery(query, 'Quality', tx=txID)
# get inventory timestamp
inventory_timestamp = event.source.parent.parent.getComponent('Container').getComponent('inventoryDate').date
for row in csvDataset[1:]: # skip header row
    item_number = row[0]
    quantity = row[3]
    # insert orders into sql table
    query = "INSERT INTO inventory_dump (dump_timestamp, item_number, quantity, uploadTime) \
        VALUES (?,?,?,?)"
    args = [inventory_timestamp, item_number, quantity, uploadTime]
    system.db.runPrepUpdate(query, args, 'Quality', tx=txID)
# commit and close the transaction
system.db.commitTransaction(txID)
system.db.closeTransaction(txID)
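One approach I've been considering is collapsing the thousands of single-row inserts into a handful of multi-row INSERT statements, so the database round-trip happens once per chunk instead of once per row. Below is a sketch of a hypothetical helper (`build_batch_insert` is my own name, not an Ignition function) that builds one `INSERT ... VALUES (?,?,?,?), (?,?,?,?), ...` statement plus the flat argument list that `system.db.runPrepUpdate` expects:

```python
def build_batch_insert(table, columns, rows):
    """Build one multi-row INSERT and a flat argument list for a prepared update."""
    # one "(?,?,...)" group per row, sized to the column count
    row_placeholder = "(" + ",".join(["?"] * len(columns)) + ")"
    query = "INSERT INTO %s (%s) VALUES %s" % (
        table,
        ", ".join(columns),
        ", ".join([row_placeholder] * len(rows)),
    )
    # flatten [[a, b], [c, d]] -> [a, b, c, d] to match the placeholders in order
    args = [value for row in rows for value in row]
    return query, args
```

Assuming the variables from the snippet above, the call site might look like this (chunking the rows keeps each statement under the JDBC driver's parameter limit, e.g. roughly 2100 parameters on SQL Server):

```python
# CHUNK = 500  # rows per statement; 500 rows x 4 columns = 2000 parameters
# data = [[inventory_timestamp, r[0], r[3], uploadTime] for r in csvDataset[1:]]
# for i in range(0, len(data), CHUNK):
#     query, args = build_batch_insert(
#         "inventory_dump",
#         ["dump_timestamp", "item_number", "quantity", "uploadTime"],
#         data[i:i + CHUNK])
#     system.db.runPrepUpdate(query, args, "Quality", tx=txID)
```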