jsonDecode bug?

Concur. The DatasetBuilder is far more robust.

Thanks everyone.
So, any suggestions on a fast, reliable way to do this across languages?
I just need to store the t_stamp/temperature pair in a column for future use.

The test runs frequently and would result in millions of rows if I went the traditional route with a sub-table.

Do what Paul described? :man_shrugging:

Can't you convert the timestamp to something like a UTC timestamp so that the data you're trying to encode/decode doesn't depend on locale?
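To illustrate the suggestion above: a minimal Python sketch of storing the timestamp as epoch milliseconds, which is a plain integer and therefore has no locale- or language-dependent representation (the specific datetime value is just an example).

```python
from datetime import datetime, timezone

# Encode: represent the instant as epoch milliseconds (a plain integer).
# Integers round-trip through JSON identically in every locale.
dt = datetime(2025, 3, 14, 15, 9, 26, tzinfo=timezone.utc)
t_stamp = int(dt.timestamp() * 1000)

# Decode: reconstruct the same instant on any machine, in any locale.
restored = datetime.fromtimestamp(t_stamp / 1000, tz=timezone.utc)
assert restored == dt
```

The same idea applies on the Java side, where `Date.getTime()` already gives you exactly this millisecond value.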

I can't help but feel the issue has something to do with the decode process trying to interpret the (English-formatted) date string with a Spanish locale and, instead of throwing an error, just falling back to returning the string.

I'll say this for posterity:
If you only have to store two values, and you know what they are and their datatypes, JSON is a pretty poor choice. A Java Date is a 64-bit integer (milliseconds since the epoch), and a Java double is a 64-bit IEEE-754 floating point value. So you could store them completely losslessly in a fixed 128 bits / 16 bytes, by definition.

Even an "efficient" JSON encoding is going to use variable-length UTF-8 strings for the timestamp and value, and will run 50-60 bytes (no less than one byte per character):
{"value":"32.175926208496094","t_stamp":"1741966360000"}
A JSON schema that can reversibly encode any Ignition dataset is going to be much less efficient than that.
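A quick Python sketch of the fixed 128-bit encoding described above, using the example values from the JSON string in this thread:

```python
import struct

t_stamp = 1741966360000        # 64-bit millisecond timestamp (what Date.getTime() returns)
value = 32.175926208496094     # 64-bit IEEE-754 double

# One signed 64-bit int plus one double packs into exactly 16 bytes.
packed = struct.pack(">qd", t_stamp, value)
assert len(packed) == 16

# The round trip is lossless by definition: same bits in, same bits out.
t2, v2 = struct.unpack(">qd", packed)
assert (t2, v2) == (t_stamp, value)

# For comparison, the JSON string from the post is roughly 3.5x larger.
as_json = '{"value":"32.175926208496094","t_stamp":"1741966360000"}'
print(len(as_json), len(packed))
```

The `>` (big-endian) prefix fixes the byte order so the blob decodes identically regardless of the machine that wrote it.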

Unless you actually need the flexibility to store an arbitrary dataset (with arbitrarily typed columns) I would hesitate to go down that road. The more constraints you can put on what you expect to go into this "extra" column, the better.

Note that this is all moot if you truly do need the flexibility of arbitrary JSON storage. I'd just question whether that's truly necessary or not.


Bwah hah ha ha! Giggle, snort! :rofl:

If size is really critical, and you only ever need this data in Ignition (no searching in the DB), I would use a byte array/blob column and org.apache.commons.lang3.SerializationUtils, possibly also with a Gzip stream.
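A rough CPython analogue of that idea, with `pickle` + `gzip` standing in for Java serialization (on the Java/Jython side, SerializationUtils would play the `pickle` role). The row data here is made up for illustration:

```python
import gzip
import pickle

# Hypothetical rows: (t_stamp in epoch millis, temperature) pairs.
rows = [(1741966360000, 32.175926208496094),
        (1741966370000, 32.25)]

# Serialize, then compress, producing a single blob for a BLOB/bytea column.
blob = gzip.compress(pickle.dumps(rows))

# Decompress and deserialize later. As with Java serialization, the blob
# is only readable by an application that speaks this format.
restored = pickle.loads(gzip.decompress(blob))
assert restored == rows
```

That opacity is exactly the trade-off raised in the next reply: the blob is compact but useless to anything outside the application that wrote it.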


I wish I could just blob it and use it only in Vision.
But this data will be pulled and used elsewhere as well, hence my choice of JSON as a universal option.

I will refactor it into a friendlier, correct JSON format.
Thanks!