I'm using the File Upload component to load 10,000 files into a database. I have found this works fine with multiple files in very small groups; 1-5 files has never caused an issue. When I try larger groups of, say, 1000 files at once, I get gaps where files are simply skipped and never load into the database. If I try those same skipped files in a small group, it works.
Out of curiosity I added a sleep timer in the onFileReceived component event and found that made the problem much worse, skipping even more files.
Has anyone else experienced a similar problem, and what solutions were found? I do not want to break my uploads down into very small chunks if I don't have to.
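Since a sleep in onFileReceived made the skipping worse, one defensive pattern is to do as little as possible in the event itself: copy the bytes out and hand the database work to a queue, so the handler returns immediately. Below is a minimal plain-Python sketch of that handoff (the names `insert_file` and `on_file_received` are hypothetical stand-ins; in Ignition the insert would be something like a `system.db.runPrepUpdate` call, and the handler would read `event.file`):

```python
# Sketch: keep the upload event handler fast by queueing the slow work.
# Plain-Python stand-in for an Ignition Perspective onFileReceived handler.
import queue
import threading

work_queue = queue.Queue()
received = []  # stand-in for the database table

def insert_file(name, data):
    # Hypothetical: in Ignition this would be system.db.runPrepUpdate(...).
    received.append((name, len(data)))

def worker():
    while True:
        item = work_queue.get()
        if item is None:  # sentinel: stop the worker
            break
        insert_file(*item)
        work_queue.task_done()

def on_file_received(name, data):
    # The event handler only enqueues and returns; no sleeping, no DB call.
    work_queue.put((name, bytes(data)))

t = threading.Thread(target=worker)
t.start()
for i in range(1000):
    on_file_received("file_%d.txt" % i, b"payload")
work_queue.put(None)
t.join()
print(len(received))  # 1000 - every file handled, none skipped
```

This does not prove the component won't still drop files under load, but it removes handler latency as a contributing factor.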
maxUploads
The maximum number of concurrent (simultaneous) uploads to allow. Default is 5.
value: integer
supportedFileTypes
An array of string values, indicating what file types are allowed to be uploaded. Example values are "pdf" or "txt".
value: array
fileSizeLimit
Specifies the maximum size of each uploaded file, in megabytes (MB). Default is 10 MB.
value: integer
You are exceeding the default max settings.
You can probably make them a bit bigger, but you really should never allow 10k files to be uploaded through the webpage...
Compress them. What kind of files are you uploading?
I set maxUploads to 10000 and made sure all file types and sizes matched those settings as well. I tested by changing those values, and the component rejects files as expected, with an indication of this in the UI. The skipped files, however, show as uploaded but never make it to the database.
I just ran another test by moving my files to cloud storage and dropping them into the File Upload component from there. As expected, this slowed down the upload speed, and at the slower speed the component no longer skips files. This is not a good solution, but it suggests some buffer is overflowing during upload and causing me to lose data.
The files vary in size, but most are small. These are coming from a legacy system that I'm migrating to Ignition. Zipping the files is an option. Here is my script.
I have tried zipping the files and found it always throws a heap error; at nearly 7 GB, it is just too much information to process. The solution I have found that works: when the files are uploaded, I save them to the gateway using system.file.getTempFile, then go back later and load all the temp files into the database. For some reason, saving the files to the gateway does not cause the issue that loading them directly into the database did.
That said, ideally I would be able to save the files to the gateway, or to a location my SQL Server can see, and then do all of this with a gateway script or T-SQL and never deal with the web interface at all.
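The two-phase approach described above can be sketched in plain Python as follows. Here `tempfile.mkstemp` stands in for system.file.getTempFile, and `insert_blob` is a hypothetical stand-in for the database insert; the point is that the upload-time work is a cheap disk write, and the expensive database load happens later:

```python
# Sketch of the two-phase pattern: stage uploads to temp files first,
# load them into the database afterward. Names are illustrative only.
import os
import tempfile

staged = []    # (name, temp path) pairs written in phase 1
database = []  # stand-in for the target table

def stage_upload(name, data):
    # Phase 1: at upload time, just write the bytes to a temp file.
    fd, path = tempfile.mkstemp(suffix=".bin")
    with os.fdopen(fd, "wb") as f:
        f.write(data)
    staged.append((name, path))

def insert_blob(name, data):
    # Hypothetical stand-in for the real database insert.
    database.append((name, data))

def load_staged():
    # Phase 2: later, read each temp file into the database and clean up.
    for name, path in staged:
        with open(path, "rb") as f:
            insert_blob(name, f.read())
        os.remove(path)

for i in range(5):
    stage_upload("doc_%d" % i, b"contents-%d" % i)
load_staged()
print(len(database))
```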
What zip tool is doing this? Java's native support for zip files can work with streams, only holding in memory what is actually needed per file (plus a little overhead).
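To illustrate the streaming point: the same per-file behavior is available from Jython/Python's zipfile module (swapping in zipfile here for the java.util.zip streams the reply mentions). `ZipFile.write` copies each file from disk into the archive one entry at a time, so only one file's data needs to be buffered, not the full 7 GB:

```python
# Sketch: build a zip archive by streaming files from disk one at a time,
# so total archive size never has to fit in memory.
import os
import tempfile
import zipfile

# Create a few sample files to archive.
src_dir = tempfile.mkdtemp()
for i in range(3):
    with open(os.path.join(src_dir, "f%d.txt" % i), "w") as f:
        f.write("x" * 1024)

archive = os.path.join(src_dir, "out.zip")
with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
    for name in sorted(os.listdir(src_dir)):
        if name.endswith(".txt"):
            # Streams from disk per entry; memory use stays bounded.
            zf.write(os.path.join(src_dir, name), arcname=name)

with zipfile.ZipFile(archive) as zf:
    names = zf.namelist()
print(names)
```

If the heap error came from reading every file into memory before zipping, a per-entry loop like this should avoid it.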
Using the script console in the Designer and the Python os module, I was able to successfully import a large quantity of files into a database. The files are stored where my local machine has access, and the os module can be used to navigate directories and pull in the files.
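The script-console approach above can be sketched like this. `run_insert` is a hypothetical stand-in; in Ignition the real call would be something like `system.db.runPrepUpdate("INSERT INTO files (name, data) VALUES (?, ?)", [name, blob])`:

```python
# Sketch: walk a local directory tree and insert each file's bytes.
import os
import tempfile

# Create a small sample tree to walk.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
for rel in ("a.txt", os.path.join("sub", "b.txt")):
    with open(os.path.join(root, rel), "wb") as f:
        f.write(b"data")

rows = []
def run_insert(name, blob):
    # Hypothetical stand-in for system.db.runPrepUpdate(...).
    rows.append((name, blob))

for dirpath, dirnames, filenames in os.walk(root):
    for fname in sorted(filenames):
        with open(os.path.join(dirpath, fname), "rb") as f:
            run_insert(fname, f.read())

print(len(rows))
```

Because this runs on a machine with direct filesystem access, it sidesteps the web upload path entirely, which matches what the thread found.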