Any limitations on string or dictionary size in Ignition?

Are there any limitations on the size of strings or dictionaries within Python scripting? I've got a script that parses the contents of a txt file to get parameters. Each line in the txt file is associated with the serial number for an item. I load the file as a string using system.file.readFileAsString, then split the returned string on the newline character '\n'. Printing the length of the resulting array gives 3185, which matches the number of lines in the file.
Next, I iterate through each line, parse out the information, and build a Python dictionary, which is merged back into a larger dictionary that will be written to a memory tag of type Document. If I pare the file down to 25 lines, the code works just fine and writes to the tag, but when I run it on the whole file it doesn't (though no error is thrown). When I put a print statement inside the for loop to troubleshoot, it gets roughly 1800 lines through before it stops.
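For reference, the loop described above looks roughly like this. The field layout (`serial,param1,param2`) is a made-up example, and plain `open()`-free string input is used so the sketch is self-contained; in Ignition you would get `text` from `system.file.readFileAsString(path)` instead.

```python
# Hypothetical sketch of the parsing loop: each line is
# "serial,param1,param2", keyed by serial number in the result dict.
def parse_parameters(text):
    params = {}
    for line in text.split('\n'):
        fields = line.split(',')
        serial = fields[0]
        # Nested dict per serial number; this whole dict would later be
        # written to a memory tag of type Document.
        params[serial] = {'param1': fields[1], 'param2': fields[2]}
    return params

sample = "SN001,10,20\nSN002,30,40"
print(parse_parameters(sample))
```

Note that this naive version will raise an IndexError on any blank line, which turns out to be relevant below.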
I'm testing this loop out in the Script Console before creating a function in the scripting library. Are there limitations to the Script Console?

What's your designer's Max memory set to?

Never mind, I believe I've found the issue. When splitting the file, it turns out there is one 'line' of zero length. Some of the script looks for the character at a given index, so every so often an IndexError was thrown, but not always. If I cast the line to a string and make sure it has sufficient length before indexing, the code works just fine.
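The fix described above can be sketched as a guard in the loop. As before, the comma-separated field layout is a hypothetical assumption, not the original file format:

```python
# Same hypothetical layout ("serial,param1,param2"), but now guarding
# against zero-length lines (e.g. from a trailing newline) and lines
# that are too short to index safely.
def safe_parse(text):
    params = {}
    for line in str(text).split('\n'):
        line = str(line)
        if not line.strip():
            continue  # skip zero-length / whitespace-only lines
        fields = line.split(',')
        if len(fields) < 3:
            continue  # skip malformed lines instead of raising IndexError
        params[fields[0]] = {'param1': fields[1], 'param2': fields[2]}
    return params

# A blank line and a trailing newline no longer break the loop.
print(safe_parse("SN001,10,20\n\nSN002,30,40\n"))
```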

In answer to the posed question: no, there are no explicit limits set, really. I would expect you to run out of memory long before you hit limits like "the maximum index of a Java list is 2,147,483,647 (Integer.MAX_VALUE)".

You might, however, find some limit on the size of a string, dataset, or document that can be written and stored in a memory tag.


When I once selected a thousand-plus rows in my table and sent that data through a message to a popup, it didn't work because it was too much data.