Perspective File Upload Skipping Files

I'm using the File Upload component to load 10,000 files into a database. This works fine with small groups; uploading 1-5 files at a time has never caused an issue. When I try larger groups of, say, 1,000 files at once, I get gaps where files are simply skipped and never land in the database. If I retry those same skipped files in a small group, they work.

Out of curiosity, I added a sleep timer in the onFileReceived component event and found that it made the problem much worse, skipping even more files.

Has anyone else experienced a similar problem, and what solutions were found? I'd rather not break my uploads down into very small chunks if I don't have to.

https://docs.inductiveautomation.com/display/DOC81/Perspective+-+File+Upload

| Name | Description | Property Type |
| --- | --- | --- |
| maxUploads | The maximum number of concurrent (simultaneous) uploads to allow. Default is 5. | value: integer |
| supportedFileTypes | An array of string values indicating which file types are allowed to be uploaded. Example values are "pdf" or "txt". | array |
| fileSizeLimit | Specifies the maximum size of each uploaded file, in megabytes (MB). Default is 10 MB. | value: integer |

You are exceeding the default max settings.
You can probably make them a bit bigger, but you really shouldn't allow 10k files to be uploaded through the webpage...
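
If you do raise the limits, you can also set them from a script instead of the property editor. A minimal sketch, assuming a sibling component named "FileUpload" (the component name and values here are illustrative):

    # Illustrative only: raise the upload limits from a script on the same view;
    # "FileUpload" is an assumed component name.
    upload = self.getSibling('FileUpload')
    upload.props.maxUploads = 100     # default is 5 concurrent uploads
    upload.props.fileSizeLimit = 50   # per-file limit in MB; default is 10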

Compress them. What kind of files are you uploading?

I set maxUploads to 10000 and made sure all file types and sizes matched those settings as well. I tested by changing those values, and the component rejects files as expected, with an indication in the UI. The skipped files, however, show as loaded but never make it to the database.

I just ran another test by moving my files to cloud storage and dropping them into the File Upload component from there. As expected, this slowed down the upload speed, and with the slower transfer the component no longer skips files. This is not a good solution, but it suggests some buffer is overflowing and causing data loss during upload.

Anything in the logs?

What script are you using to insert the files into the database?

And presumably these aren't very large files... any reason you can't zip them before upload, then unpack the zip if you need to before inserting it?

Nothing in the logs that I have been able to find so far.

The files vary in size, but most are small. These are coming from a legacy system that I'm migrating to Ignition. Zipping the files is an option. Here is my script:

    import sys
    import traceback

    DocumentId = ClaimId = None  # so the except block can always reference them
    try:
        # File names arrive in the form "DocumentId==ClaimId==fileName"
        fileMeta = event.file.name.split('==')
        DocumentId = fileMeta[0]
        ClaimId = fileMeta[1]
        fileName = fileMeta[2]
        # Raw bytes of the uploaded file
        document = event.file.getBytes()
        # Build parameter dictionary
        params = {'DocumentId': DocumentId, 'FileName': fileName, 'Document': document}
        # Run insert query
        system.db.runNamedQuery('Claim/insertDocumentImport', params)
    except Exception:
        # log is a list defined elsewhere that collects failed inserts
        log.append({'ClaimId': ClaimId, 'DocumentId': DocumentId,
                    'ErrorClass': sys.exc_info()[0], 'ErrorMsg': traceback.format_exc()})

Try that. Uploading that many files at once is never a good idea.

I have tried zipping the files, and it always throws a heap error; at nearly 7GB it is just too much to process at once. The workaround I found is to save the uploaded files to the gateway using system.file.getTempFile, then go back later and load all the temp files into the database. For some reason, saving the files to the gateway does not trigger the issue that inserting them directly into the database did.
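
For anyone trying the same workaround, a minimal sketch of that onFileReceived handler; system.file.getTempFile and system.file.writeFile are standard functions, but the 'Claim/insertPendingFile' named query and its parameters are assumptions:

    # Stage 1 (onFileReceived): persist the bytes to a gateway temp file only.
    # system.file.getTempFile creates an empty temp file and returns its path.
    path = system.file.getTempFile('bin')
    system.file.writeFile(path, event.file.getBytes())
    # Record where each upload landed so a later gateway script can insert it;
    # 'Claim/insertPendingFile' is a hypothetical named query.
    system.db.runNamedQuery('Claim/insertPendingFile',
                            {'FileName': event.file.name, 'TempPath': path})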

That said, ideally, if I were allowed to save files to the gateway, or to a location my SQL Server could see, I could do this with a gateway script or T-SQL and not deal with the web interface at all.

What zip tool is doing this? Java's native support for zip files can work with streams, only holding in memory what is actually needed per file (plus a little overhead).
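
Roughly like this, as a Jython sketch; the archive path is a placeholder and the database insert is elided:

    import jarray
    from java.io import FileInputStream, ByteArrayOutputStream
    from java.util.zip import ZipInputStream

    # Stream the archive entry-by-entry instead of loading the whole zip
    zin = ZipInputStream(FileInputStream('/path/to/import.zip'))  # placeholder path
    buf = jarray.zeros(8192, 'b')
    try:
        entry = zin.getNextEntry()
        while entry is not None:
            baos = ByteArrayOutputStream()
            n = zin.read(buf)
            while n != -1:
                baos.write(buf, 0, n)
                n = zin.read(buf)
            data = baos.toByteArray()  # bytes for this single entry only
            # ...insert data into the database here, keyed on entry.getName()...
            entry = zin.getNextEntry()
    finally:
        zin.close()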

I was using Python's ZipFile, and I had 10k files in the zip.

Hmm. Is this something you are doing once? Or will it be a typical user operation?

I ask, because Perspective has to upload the whole batch before it will call the action script, so the heap will remain a problem, I think.

If it's just once, consider using the Designer's ability to work with local files to push these into the database.

This is a one-time load for data migration. How do I work with local files in the Designer?

Using the script console in the Designer and the Python os module, I was able to successfully import a large quantity of files into a database. The files are stored where my local machine has access, and the os module can be used to navigate directories and pull in the files.
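
A rough sketch of that script-console approach; the root directory is a placeholder, and it assumes the file names still follow the DocumentId==ClaimId==fileName scheme from the upload script:

    import os

    root = r'C:\migration\documents'  # placeholder local directory
    for dirpath, dirnames, filenames in os.walk(root):
        for fname in filenames:
            # Same naming convention as the original upload script
            documentId, claimId, realName = fname.split('==')
            data = system.file.readFileAsBytes(os.path.join(dirpath, fname))
            system.db.runNamedQuery('Claim/insertDocumentImport',
                                    {'DocumentId': documentId,
                                     'FileName': realName,
                                     'Document': data})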