Converting integers in dataset

I’m seeing something odd, and I don’t know how best to handle it.

I’ve got a Python dictionary of values. This dictionary is initialized on startup by reading some values from the database. The datatype of one particular column is bigint, so if I print out the dictionary values, I see numbers like 45L, 207865L, etc.
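For context, the initialization looks roughly like this (a minimal sketch; the query, table, and column names here are placeholders, not my actual code):

import system

# Hypothetical startup query; table and column names are made up.
results = system.db.runQuery("SELECT machine_id, counter FROM machines")
d = {}
for row in results:
	# Jython hands back the bigint column as a Python long, hence 45L etc.
	d[row[0]] = {'counter': row[1]}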

But in my code I manipulate these numbers, so 45L may become 0 or 1. This was never an issue until I tried to convert the dictionary to a dataset and display the values on screen. For instance, this code will fail:

import system

# Simulated startup dictionary: column 'b' holds a long (as read from the
# bigint column) in one row and a plain int in the other.
d = {}
d[1] = {'a': 1, 'b': 20L}
d[2] = {'a': 3, 'b': 1}
k = d.keys()
k.sort()

rows = []
for i in k:
	rows.append([i, d[i]['a'], d[i]['b']])

# Mixing the two integer types in ordinary math works fine...
print d[1]['b'] + d[2]['b']

# ...but this call fails when the long value is in the first row.
headers = ["Machine", "A", "B"]
data = system.dataset.toDataSet(headers, rows)
print data

I can do math with the two integer types together, but I can’t insert them into the same dataset if the long integer comes first in the sequence (I get a java.math.BigInteger conversion error). It does work in this example if I remove the k.sort() call, I assume because the column’s datatype is then set to INT. So it looks like a BigInteger can be added to an Integer column, but an Integer cannot be added to a BigInteger column. Does this make sense?
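To illustrate, building the rows with the plain int first succeeds for me (a sketch, hardcoding the reversed row order rather than removing the sort):

# Same data, but with the plain int in the first row. This version works,
# presumably because the column type gets set to INT from the first value.
rows = [[2, 3, 1], [1, 1, 20L]]
data = system.dataset.toDataSet(["Machine", "A", "B"], rows)
print data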

Is there some way of defining the dataset with specific datatypes? Is this the behavior you would expect? For now, I just multiply the column values by 1L to force the conversion.
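Concretely, the workaround is just this change to the loop above:

rows = []
for i in k:
	# Multiplying by 1L promotes the int values to long, so the whole
	# column ends up with one consistent type before toDataSet() sees it.
	rows.append([i, d[i]['a'], d[i]['b'] * 1L])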

Deja vu :smiling_imp:

Being able to explicitly define the datatypes of a new dataset isn’t a bad idea, but in the meantime I’ve tweaked the handling of the toDataSet() call so that it uses Longs instead of BigIntegers. The fix is in 7.1.7.