Tag History to CSV and Back: how do I fix the Ignition crash?

Greetings,

I have a script that successfully exports the tag historian to a CSV file. However, I'm encountering an issue when attempting to restore the CSV file to the SQL database.
The problem arises due to the sheer volume of data, as there are millions of rows in the file. Does anyone know of a viable way to efficiently transfer such a large number of rows back into the database?

csvRead = csv.reader(csvFile)
csvDataset = []
for row in csvRead:
    csvDataset.append(row)
csvFile.close()

Anyone know how to make this faster?

Please format code as code using the </> button. It converts this ...

... into this ...

csvRead = csv.reader(csvFile)
csvDataset =
for row in csvRead:
    csvDataset.append(row)
csvFile.close()

... but you seem to be missing something on the second line.

Thanks for the tip.

What I was missing was just an empty list,
but that was not the problem.

Some consideration points:

  • It will try to put the entire dataset into memory.
  • As a general rule the db will only allow inserts of up to 1000 rows.

I would suggest breaking your process into 1000-row chunks.
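
For what it's worth, a minimal sketch of that idea (the path, chunk size, and insertChunk() below are placeholders): read the file lazily and hand back one chunk at a time, so the whole CSV never sits in memory at once.

import csv

def readInChunks(path, chunkSize=1000):
    # Yield lists of at most chunkSize rows; the full file is never held in memory.
    f = open(path, 'r')
    try:
        chunk = []
        for row in csv.reader(f):
            chunk.append(row)
            if len(chunk) >= chunkSize:
                yield chunk
                chunk = []
        if chunk:
            yield chunk
    finally:
        f.close()

# usage, with whatever insert logic you end up with:
# for chunk in readInChunks("C:/path/to/your_export.csv", 1000):
#     insertChunk(chunk)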

Hello, thanks for your reply.

I don't think the database is the problem here,
because this is an insert per row.

But Ignition crashes when the CSV becomes a list,

and I don't know how to change this.

If you are inserting per row, then you don't need to make csvDataset. Just insert the row right away.
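
A minimal sketch of that, just to illustrate, streaming straight from the reader. It assumes the usual seven historian columns (tagid, intvalue, floatvalue, stringvalue, datevalue, dataintegrity, t_stamp) and a partition table name you would swap for your own:

import csv

csvFile = open(csvPath, 'r')    # csvPath = path to your exported CSV
try:
    query = """INSERT INTO sqlt_data_1_2023_06
               (tagid, intvalue, floatvalue, stringvalue, datevalue, dataintegrity, t_stamp)
               VALUES (?,?,?,?,?,?,?)"""
    for row in csv.reader(csvFile):
        # empty CSV fields become NULLs; each row goes straight to the DB, no list is built up
        args = [v if v != "" else None for v in row]
        system.db.runPrepUpdate(query, args)
finally:
    csvFile.close()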

Inserting a row at a time is going to be less efficient. A series of ops should be something like:

csvRead = csv.reader(csvFile)
csvDataset = []
count = 0
for row in csvRead:
    csvDataset.append(row)
    count += 1
    if count >= 1000:
        insert_into_DB_here()
        csvDataset = []
        count = 0
if count > 0:
    last_insert_into_db()

csvFile.close()

If you use system.db.runPrepUpdate to insert your rows, I do have a helper function to create the question marks for you.

Consider not doing quite so many. IIRC, JDBC supports only a few thousand question marks per statement.
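
As a rough back-of-the-envelope check (the parameter budget here is an assumption; the exact ceiling depends on the driver and database):

paramBudget = 2000       # assumed to stay safely under the driver's parameter limit
columnsPerRow = 7        # tagid, intvalue, floatvalue, stringvalue, datevalue, dataintegrity, t_stamp
rowsPerChunk = paramBudget // columnsPerRow    # 285 rows per INSERT statement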


Thanks, I think this should work.

I'm just not sure how to do it, but I will try.

How many rows at a time would be best?

Well, it works. Ignition doesn't crash anymore. Thanks.

Then one more thing:
2 million rows take around 15 minutes, and that is only one day.

Is there any way to make this process faster?
Because this is only a small tag historian for us.
My script right now is:


def runAction(self, event):
	import csv
	
	# get file path of upload and then convert csv data to dataset
	csvPath = "C:/test historyFri Jun 16 09 02 22 CEST 2023/sqlt_data_1_20230611.csv"
	
	csvFile = open(csvPath, 'r')

	csvRead = csv.reader(csvFile)
	csvDataset = []
	count = 0
	for row in csvRead:
		print "test"
		csvDataset.append(row)
		count += 1
	
		if count >= 1000:
			for row in range(len(csvDataset)):
				tagid = csvDataset[row][0] 
				intvalue = csvDataset[row][1]
				if intvalue == "":
					intvalue = None  
				floatvalue = csvDataset[row][2]
				if floatvalue == "":
					floatvalue = None  
				stringvalue = csvDataset[row][3]
				if stringvalue == "":
					stringvalue = None  
				datevalue = csvDataset[row][4]
				if datevalue == "":
					datevalue = None  
				dataintegrity = csvDataset[row][5]
				t_stamp = csvDataset[row][6]
		
				# insert orders into sql table
				query = """use mijn_database INSERT INTO sqlt_data_1_2023_06 (tagid ,intvalue, floatvalue, stringvalue,datevalue,dataintegrity,t_stamp) 
						 VALUES (?,?,?,?,?,?,?)"""
				args = [tagid,intvalue, floatvalue, stringvalue,datevalue,dataintegrity,t_stamp]
				system.db.runPrepUpdate(query, args)
			csvDataset = []
			count = 0
	if count>0:
		for row in range(len(csvDataset)):
			tagid = csvDataset[row][0] 
			intvalue = csvDataset[row][1]
			if intvalue == "":
				intvalue = None  
			floatvalue = csvDataset[row][2]
			if floatvalue == "":
				floatvalue = None  
			stringvalue = csvDataset[row][3]
			if stringvalue == "":
				stringvalue = None  
			datevalue = csvDataset[row][4]
			if datevalue == "":
				datevalue = None  
			dataintegrity = csvDataset[row][5]
			t_stamp = csvDataset[row][6]
	
			# insert orders into sql table
			query = """use mijn_database INSERT INTO sqlt_data_1_2023_06 (tagid ,intvalue, floatvalue, stringvalue,datevalue,dataintegrity,t_stamp) 
					 VALUES (?,?,?,?,?,?,?)"""
			args = [tagid,intvalue, floatvalue, stringvalue,datevalue,dataintegrity,t_stamp]
			system.db.runPrepUpdate(query, args)

Insert multiple rows at once. But, as @pturmel said, it may only support a few thousand question marks. I modified your script, shown below. Start with 500 rows at a time, then raise or lower depending on how it runs.

def runAction(self, event):
	import csv

	def prepInsertQmarks(numOfGroups, groupSize):
		'''
		Generate a string of grouped question marks for prepUpdate inserts.
		Example: generate three groups of four question marks:
		prepInsertQmarks(3, 4) returns (?,?,?,?),(?,?,?,?),(?,?,?,?)
		'''
		group = '(' + ','.join(['?'] * groupSize) + ')'
		return ','.join([group] * numOfGroups)

	# get file path of upload and then convert csv data to dataset
	csvPath = "C:/test historyFri Jun 16 09 02 22 CEST 2023/sqlt_data_1_20230611.csv"

	csvFile = open(csvPath, 'r')

	csvRead = csv.reader(csvFile)
	csvDataset = []
	count = 0
	query = """use mijn_database INSERT INTO sqlt_data_1_2023_06 (tagid, intvalue, floatvalue, stringvalue, datevalue, dataintegrity, t_stamp)
			   VALUES {}"""

	for row in csvRead:
		print "test"
		# using extend instead of append adds the values as one flat list to pass to runPrepUpdate() later
		csvDataset.extend(row)
		count += 1
		# Based on @pturmel's information, check for 500 rows instead of 1000.
		# Reduce the count threshold value if an error occurs.
		if count >= 500:
			qmarks = prepInsertQmarks(count, len(row))
			# insert orders into sql table
			system.db.runPrepUpdate(query.format(qmarks), csvDataset)
			csvDataset = []
			count = 0
	if count > 0:
		qmarks = prepInsertQmarks(count, len(row))
		# insert orders into sql table
		system.db.runPrepUpdate(query.format(qmarks), csvDataset)

I don't understand the function prepInsertQmarks.
But if I try to run the code above, it gives an error.

Wait, it worked. The only thing I had to change was 500 to 200,
otherwise the error was:
caused by SQLServerException: The incoming request has too many parameters. The server supports a maximum of 2100 parameters. Reduce the number of parameters and resend the request.

Thank you so much.


There, we now know for sure what the hard limit is. :slight_smile:

EDIT: So, for seven columns, you could bump it up to 300 rows. But I'd maybe put it to 280. Why tempt fate? :laughing:


Varies by brand and version IIRC.


I did try 250, but it didn't work. Within 1 minute is fast enough for me, though. Thanks.


Wouldn't this be one of those cases where you would prefer system.db.runSFPrepUpdate instead of system.db.runPrepUpdate, or am I misunderstanding the purpose of that function?

I'd think you'd still be limited by the number of parameters you can send at one time.