[BUG-15901] JSON encoding and decoding a "Dataset" datatype property

I am trying to pass a value of “Dataset” type as a value in a dictionary which I need to store in a “String” format. Assume the following:

dataset: variable of type "Dataset"
dict = {'set': dataset}

To build the string representation for the dictionary, I am using system.util.jsonEncode. When I try to retrieve the dictionary from its string representation using system.util.jsonDecode, I do not get the dataset back in “Dataset” type. Instead, I get a representation of it in some form.

Is there a way to get back the dataset from its representation? If not, is there any other way to achieve the following: Dataset --> Dictionary --> String --> Dictionary --> Dataset?
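Outside Ignition, the same loss can be reproduced with plain Python's json module standing in for system.util.jsonEncode/jsonDecode (a minimal sketch with hypothetical data): the values survive the round trip, but any specific type collapses to generic dicts and lists.

```python
import json

# A stand-in for a dataset: column names plus rows (hypothetical data).
dataset = {"columns": ["tag", "value"], "rows": [["temp", 21.5], ["rpm", 1200]]}

# Dataset -> Dictionary -> String
encoded = json.dumps({"set": dataset})

# String -> Dictionary: the values come back, but only as plain
# dicts/lists -- the original type information is gone.
decoded = json.loads(encoded)
print(type(decoded["set"]))  # <class 'dict'>, not a Dataset
```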

I was able to do so with a few extra steps. It basically goes Dataset -> Encoded dataset -> Dictionary -> String -> Dictionary -> Encoded dataset -> Dataset. In the example here I am reading a dataset tag, but it should work with any other dataset, provided you first convert it to a PyDataSet.

	# Original dataset
	dataset = system.tag.readBlocking(['asdf'])[0].value
	# Convert to a PyDataSet
	pydata = system.dataset.toPyDataSet(dataset)
	# Encode JUST the dataset
	encoded_dataset = system.util.jsonEncode(pydata)
	# Create the dict
	some_dict = {'the_data': encoded_dataset}
	# Encode the dict with the encoded data inside
	encoded_dict = system.util.jsonEncode(some_dict)
	# Decode the dictionary
	decoded_dict = system.util.jsonDecode(encoded_dict)
	# Decode the inner dataset
	self.custom.dataset = system.util.jsonDecode(decoded_dict.get('the_data'))
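The double-encode/decode pattern above can be sketched outside Ignition with the standard json module in place of system.util.jsonEncode/jsonDecode (hypothetical headers and rows):

```python
import json

# Stand-in for the PyDataSet: headers plus rows (hypothetical values).
pydata = {"headers": ["t_stamp", "value"], "rows": [[1647577698000, 3.74]]}

# Encode JUST the dataset, then nest the encoded string inside a dict.
encoded_dataset = json.dumps(pydata)
some_dict = {"the_data": encoded_dataset}

# Encode and later decode the outer dict.
encoded_dict = json.dumps(some_dict)
decoded_dict = json.loads(encoded_dict)

# Decode the inner dataset back out of the string it was stored as.
recovered = json.loads(decoded_dict["the_data"])
print(recovered == pydata)  # True: the nested round trip preserves the values
```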

All this said, this should probably be improved on our end.

You can import and use TypeUtilities; something like this should work:

from com.inductiveautomation.ignition.common import TypeUtilities
jsonStr = str(TypeUtilities.datasetToJSON(<your dataset>))
ds = TypeUtilities.datasetFromJSON(jsonStr)

Thanks @osolorzano. That worked, but I agree with you: an improvement here would be quite helpful.

@PGriffith
Thanks for your response, but the solution did not work. When trying to get the dataset back from the JSON string, it errors out saying the JSON string could not be coerced to JSON.

The reason Paul's instruction threw an error is that the decode call should be:

ds = TypeUtilities.toDataset(jsonStr)

The advantage of using TypeUtilities is that you can pass in a "real" dataset and get a "real" dataset back, instead of a PyDataSet.

How about this? Create a JSON dict in a project library script, call it from the ‘runScript()’ expression function, then use a derived tag to push the JSON to a dataset memory tag, using jsonGet and jsonSet?

@PGriffith Paul, I am using:

str(TypeUtilities.datasetToJSON(dataset))

to convert a dataset to a string for messaging. This works fine until the message contains unusual characters. For instance, if I try to send the Swedish word “kalenderår” it gets corrupted to “kalenderÃ¥r”.
Any ideas?

str() is collapsing the string to a narrower character encoding. Try wrapping the TypeUtilities call in unicode() instead.
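The corruption pattern above (“kalenderår” → “kalenderÃ¥r”) is classic mojibake: UTF-8 bytes re-read under a one-byte-per-character encoding. Ignition's Jython is Python 2, where unicode() exists; the sketch below uses Python 3 just to demonstrate the mechanism:

```python
# "å" encodes to the two UTF-8 bytes 0xC3 0xA5; decoding those bytes as
# Latin-1 yields the two characters "Ã¥" -- exactly the corruption seen.
word = "kalenderår"
mangled = word.encode("utf-8").decode("latin-1")
print(mangled)  # kalenderÃ¥r

# The damage is reversible only while no bytes have been dropped.
print(mangled.encode("latin-1").decode("utf-8") == word)  # True
```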

Thank you. Thank you. Thank you. I love easy fixes.


I’m trying to use the TypeUtilities library to encode and decode datasets, because I’d like to store the datasets in SQL.

I’ve noticed that using TypeUtilities causes me to lose milliseconds on the datetime data in my datasets, as you can see below:

It looks like the milliseconds are stripped out during conversion from dataset to JSON string. Here are the first two rows from the dataset, in JSON format:

[[-1.7109261751174927,3.7401225566864014,-0.0666724145412445,3.195615530014038,"Fri Mar 18 16:28:18 NZDT 2022"],
[-1.6445746421813965,3.6924030780792236,-0.05308615043759346,3.193087339401245,"Fri Mar 18 16:28:18 NZDT 2022"]

What determines how datetime objects are represented as strings? I’m sure milliseconds can be recorded; I’m just not sure whether these particular functions let me change that behaviour.

Managed to get it working with some hacking via a custom jsonEncodeDataset function I wrote.
Any constructive criticism of how to better extract the Java “type” for columns would be much appreciated.

	ds = self.getSibling("Table RawData").props.data
	pyDS = system.dataset.toPyDataSet(ds)
	
	# Create column keyValues
	# Example of format required for column list:
	# columns = [{"name":"AIN0","type":"java.lang.Double"},{"name":"AIN1","type":"java.lang.Double"},{"name":"AIN2","type":"java.lang.Double"},{"name":"AIN3","type":"java.lang.Double"},{"name":"t_stamp","type":"java.util.Date"}]
	cols = []
	for c in range(ds.getColumnCount()):
		# NOTE: str() of the column class looks like "<type 'java.lang.Double'>",
		# so strip the wrapper to leave just the class name.
		# (Named colType to avoid shadowing the Python builtin "type".)
		colType = str(ds.getColumnType(c)).replace("<type '", "").replace("'>", "")
		cols.append({"name": ds.getColumnName(c), "type": colType})
	
	# Create row keyValues
	newRows = []
	for row in pyDS:
		# Couldn't decode using TypeUtilities. Maybe due to timezone?
		###dateStr = system.date.format(row["t_stamp"], "yyyy-MM-dd HH:mm:ss.SSS z")
		dateStr = system.date.format(row["t_stamp"], "yyyy-MM-dd HH:mm:ss.SSS")
		newRows.append([row["AIN0"], row["AIN1"], row["AIN2"], row["AIN3"], dateStr])
	
	newDict = {"columns": cols, "rows": newRows}
	encoded_dict = system.util.jsonEncode(newDict)
	
	self.parent.custom.jsonDataStr = encoded_dict

EDIT: I realize i have hard coded the extraction of data with the newRows.append( line. I’ll sort that out but you get the gist of it.

For type, use ds.getColumnType(c).getCanonicalName(); canonicalName is an attribute of java.lang.Class (see the Java SE 11 javadoc for Class).

As for your precision loss question: Yes, unfortunately the way TypeUtilities ends up encoding to JSON uses a method that has no awareness of Date objects, and thus is just calling their default toString() implementation, which has a hardcoded pattern:
https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/util/Date.html#toString()

Safer than relying on any string representation would probably be storing Dates as their time in epoch millis; the existing TypeUtilities method should still be able to reconstruct that into a dataset with the right type.
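A sketch of the epoch-millis idea in plain Python (standard-library json and datetime; the row layout is hypothetical): an integer of milliseconds survives JSON exactly, with no date format string involved.

```python
import json
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Hypothetical timestamp with millisecond precision.
t = datetime(2022, 3, 18, 16, 28, 18, 123000, tzinfo=timezone.utc)

# Date -> epoch millis before encoding; integer arithmetic is exact.
millis = int((t - EPOCH) / timedelta(milliseconds=1))
encoded = json.dumps({"rows": [[millis, 3.74]]})

# On decode, reconstruct the Date from the stored millis.
decoded = json.loads(encoded)
restored = EPOCH + timedelta(milliseconds=decoded["rows"][0][0])
print(restored == t)  # True: milliseconds preserved through the round trip
```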
