CSV File Read Stops Updating

I have been running a script in Ignition on a 15-second timer which opens a CSV file created by another application and reads the information currently stored so it can be displayed on a window I have created. The information is written by the other application on a 5-minute timer, replacing the file each time it updates. (The 15-second timer is just so I can see the changes on the screen without a long wait while debugging. I don’t expect to run this much faster than 1 minute, because it only has to keep up with periodic changes made by the production floor supervisors while still displaying their information in a timely manner. I set the other application's rate to 5 minutes for now because it interrupts my screen with a cmd prompt, which is annoying at 5 minutes and would be even more so at a higher rate!)

Almost everything seems to run in my test environment, except that periodically the window stops updating even though I know the data is new and can clearly see that the CSV file changed (preview pane in Explorer). All the previous data stays on the screen as if the script never sees the new file.

I added print statements to the Console and can follow the progression of the script. The script repeats on the 15-second cycle, but the data I am reading (size, number of rows, and contents) does not match the current CSV file; it matches the old CSV file that no longer exists.

Any suggestions?

Are you making sure that every time you read from the CSV file you are then closing it? Your script may be keeping the old copy open and reusing that data each cycle instead of pulling the new data each time.

For the screen interruption issue, instead of using cmd, you might consider using
the glob library: import glob
the os library: import os
Note: The glob library uses UNIX-style path patterns, but it works great on various Windows and Windows Server OS's.

	# import libraries
	import glob			# used to get directory and file name info from the OS
	import os			# used to manipulate OS files and directories, etc.
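For example, a sketch like this lists the CSV files in a folder and prints their size and last-modified time entirely from inside the script, with no cmd window involved (the folder path is just a placeholder):

	# Placeholder folder - point this at the directory the other application writes to
	csvFiles = glob.glob("C:\\SomeFolder\\*.csv")	# all CSV files in that folder
	for csvFile in csvFiles:
		print csvFile				# full path to the file
		print os.path.getsize(csvFile)		# size in bytes
		print os.path.getmtime(csvFile)		# last-modified time (seconds since the epoch)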

For the problem of "the window stops updating":

File reads and writes can be locked by the OS if more than one user/application is performing file functions at the same time. With different timing from the two applications, it's possible that you are on occasion being stopped by the OS.

Whenever I've worked with this type of situation where multiple applications are doing read/write/replace file operations, I've found it best if the application that creates or replaces the file renames or copies the old file before writing the new replacement. If the rename fails, it's because the file is currently being manipulated by the other application. If the application reading the file can't find it, try again every 500 ms until it's available or you hit a maximum retry count that you set for yourself. That ensures you are never trying to read or manipulate a file while another application is manipulating the same file.
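A rough sketch of that retry loop might look like the following. The 500 ms wait and 10-attempt limit are arbitrary, path is assumed to be the CSV location, and processCsv() is just a stand-in for whatever you do with the file:

	import os, time

	maxRetries = 10				# give up after this many attempts
	attempts = 0
	while not os.path.isfile(path) and attempts < maxRetries:
		time.sleep(0.5)			# wait 500 ms, then check again
		attempts += 1
	if os.path.isfile(path):
		fileIn = open(path, "r")	# read-only
		try:
			processCsv(fileIn)	# stand-in for the actual CSV handling
		finally:
			fileIn.close()		# always release the file handle
	else:
		print "CSV file still missing after %d retries" % maxRetries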

Also, be sure to open the file in ReadOnly mode unless you actually need to write to it. That cuts down on OS file locks.
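In Python that just means passing the mode explicitly when you open the file, for example:

	fileIn = open(path, "r")	# "r" opens the file read-only (it is also the default mode)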

If the script is running as a Gateway timer script, ensure the threading is set to Dedicated, not Shared.

I had a script that would run once per day but was intermittent; changing to dedicated threading ensured it ran each time it was supposed to.

That’s exactly what I was thinking, but I was unable to find the correct code to close the file.

I wish I could use something besides cmd, but the company producing the software insists there is nothing they can do about it. I have one query running once a day that produces a CSV file, and that wouldn’t be a problem. The other is running much more frequently.

I thought of renaming the file as well, but once again my application software doesn’t allow me that flexibility.

I will look into opening the file in ReadOnly mode. I have more control of the Ignition end.

I looked at that and wasn’t sure if it would do anything to help. I will try that.

My code if it helps:

import csv
import datetime
import os
Today = datetime.datetime.now()
print Today
TopTenList = 0
#CSV from Asset Essentials Connector Tool - File name and location
path = "\\\\MTSWVIR2\\Data\\PLT_ENG\\PLT_ENG\\Plant and Equipment\\AssetEssentials\\WorkOrdersFromAssets.csv"
#Establish that records exist in file
RowsExist = True
print path
print os.stat(path).st_size
if os.stat(path).st_size == 0: 
	RowsExist = False	# file size of 0 indicates no records
if RowsExist:	#There are records in the CSV
	# Create a reader object that will iterate over the lines of a CSV.
	# Using Python's built-in open() function to open the file. 
	csvData = csv.reader(open(path))
	# Creating a List of strings to use as a header for the dataset. 
	header = csvData.next()
	# Create a dataset with the header and the rest of our CSV.
	print header
	WODataSet = system.dataset.toDataSet(header, list(csvData))
	# Get the number of rows in the dataset.
	RowsInCSV = WODataSet.getRowCount()

You need to open the file separately from the reader.

if RowsExist:
	fileIn = open(path)			# open the file yourself so you keep a reference to it
	csvData = csv.reader(fileIn)		# hand the open file object to the CSV reader
	do_some_stuff()
	fileIn.close()				# close it when finished so the OS releases the handle
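A with block does the same thing and closes the file automatically, even if something in the middle throws an error:

if RowsExist:
	with open(path) as fileIn:
		csvData = csv.reader(fileIn)
		do_some_stuff()
	# the file is closed here whether or not an error occurred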

Thanks! I wasn’t splitting it out like that and got syntax errors when I tried to close the file. This worked as far as not producing errors, but it didn’t resolve my issue. The script is still loading old file data. There should only be two rows in the CSV; my debug statements are showing 25.

The file seemed to be stuck open by Windows. I had to restart all the programs and delete the CSV file to reset the data being read. I will continue to test with the file close in the script and see if that prevents this problem in the future. Thanks!