"Cannot convert row x column y to type java.lang.Long"

So, I’ve seen threads similar to this, but this one I really do think is some kind of bug.

I have rows/columns of data that, when printed via a logger from system.util.getLogger, come out looking something like:

[ [5L, 3.6], [0L, 0L] ]

Or something similar; this is just for illustration.

So when I try to do system.dataset.toDataSet on this (with a header list provided, of course) it fails with

Unable to convert row x, column y to type class java.lang.Long

I can tell it’s because the columns end up with a mix of floats and integers (Longs), so the types don’t mesh.

However, no matter what I do, I cannot get the script to treat the integers in there as floats. Wrapping them in float() doesn’t work, wrapping them in java.lang.Float() doesn’t work, adding 0.0 doesn’t work, multiplying by 1.0 doesn’t work. Nothing.

The only thing that has worked is adding a small value to each number to force it to a float, 0.0001 in my case. That is obviously not ideal, and given that I can’t get this to work normally no matter what I try, I feel like it has to be a bug.

What version are you on?

Testing this in the script console works for me:

old = [ [5L, 3.6], [0L, 0L] ]

new = [[float(subitem) for subitem in item] for item in old]

print system.dataset.toDataSet(['Col1', 'Col2'], new)
		

That works for me too in the script console.

I’m passing data in from a Perspective component’s data property, since apparently that seems to matter?

This is taking place within a function that converts a list of dicts to a dataset; here’s that function.

def list_to_ds(in_list):
	"""
	Converts a list of dicts to an Ignition DataSet.
	
	This assumes that the first item in the list is a dict that contains all of the keys that will also be present in all other items.
	"""
	headers = in_list[0].keys()
	
	output_list = []
	
	for row in in_list:
		new_row = [ row[key] for key in headers ]
		output_list.append(new_row)
	
	logger = system.util.getLogger("test")
	logger.info(str(output_list))
	
	ds = system.dataset.toDataSet(headers, output_list)
	
	return ds
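
For reference, it’s being called from a Perspective script transform along these lines; the library name below is just a placeholder, but the transform signature is the standard Perspective one:

# Script transform on the Perspective binding (names below are placeholders).
# 'util' is an assumed project library script that holds list_to_ds;
# 'value' is the bound data property, a list of dicts.
def transform(self, value, quality, timestamp):
	return util.list_to_ds(value)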
	

Whelp, I finally got it. I thought I had tried this earlier with a slightly different method, but now it worked.

def list_to_ds(in_list):
	"""
	Converts a list of dicts to an Ignition DataSet.
	
	This assumes that the first item in the list is a dict that contains all of the keys that will also be present in all other items.
	"""
	headers = in_list[0].keys()
	
	output_list = []
	
	for row in in_list:
		new_row = [ row[key] for key in headers ]
		converted_row = []
		
		for item in new_row:
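			# Coerce to float where possible; non-numeric values (e.g. strings) pass through unchanged.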
			try:
				item = float(item)
			except:
				pass
				
			converted_row.append(item)
		
		output_list.append(converted_row)
		
	
	logger = system.util.getLogger("test")
	logger.info(str(output_list))
	
	ds = system.dataset.toDataSet(headers, output_list)
	
	return ds

Interesting! :open_mouth: What input was causing it to fail?

Happy you got it though :smiley:

You could also use a com.inductiveautomation.ignition.common.util.DatasetBuilder to construct the dataset imperatively, with fixed column types, rather than relying on the automatic coercion in system.dataset.toDataSet.
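
Roughly like this; I’m writing it from memory, so treat the exact builder calls as a sketch and check them against the SDK javadocs:

from com.inductiveautomation.ignition.common.util import DatasetBuilder
from java.lang import Double

# Column names and Java types are declared up front, so there is no guessing
# about whether a column should be Long or Double.
builder = DatasetBuilder.newBuilder() \
	.colNames("Col1", "Col2") \
	.colTypes(Double, Double)

builder.addRow(5.0, 3.6)
builder.addRow(0.0, 0.0)

ds = builder.build()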


I wish I had known this existed like 2 years ago! I have code where I manually go through and try to cast every element in a list of lists to a float before calling system.dataset.toDataSet, and if that fails I pop elements from the end of the list until the toDataSet call passes -_-
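
For the curious, that hack looks roughly like this (reconstructed from memory rather than pasted, so the names are illustrative):

def force_to_dataset(headers, rows):
	# First pass: cast everything that will cast to a float.
	cast_rows = []
	for row in rows:
		cast_row = []
		for value in row:
			try:
				value = float(value)
			except (TypeError, ValueError):
				pass
			cast_row.append(value)
		cast_rows.append(cast_row)
	# If toDataSet still complains, drop rows off the end until it stops.
	while cast_rows:
		try:
			return system.dataset.toDataSet(headers, cast_rows)
		except:
			cast_rows.pop()
	# Nothing survived; give up.
	return None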