Data Source Script Error

This is the data source script I have (quite trivial, actually):

   newDataset = []
   rawDataset = data['ds_query'].getCoreResults()
   for row in range (rawDataset.rowCount):
		lotVal = rawDataset.getValueAt(row,'LOT')
	    outputVal = rawDataset.getValueAt(row,'Output')
    nextDataset = system.dataset.toDataSet(header,newDataset)
    data['DataSource'] = nextDataset 

It gives me this error every time: "unindent does not match an outer indentation level". Can't figure out why. Any ideas?

Python is a whitespace-sensitive programming language; instead of curly braces { } to denote blocks of code, Python uses indentation. Indentation can be either tabs or spaces, but it must be consistent.
So in your example, the 'for' statement opens a block: there must be at least one line indented some amount past it (the generally recommended style is 4 spaces; Ignition's code editors use 1 tab character for legacy reasons). Any statements at the same indentation level as that first line will be looped over in the 'for' block. Anything back at the same indentation level as the 'for' statement itself will be executed after the loop, because the block is then closed.
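A minimal sketch of that rule, using plain Python lists as stand-ins for the dataset (the row values here are made up for illustration):

```python
rows = [("A1", 10), ("A2", 20)]

collected = []
for lot, output in rows:
    # These two lines are indented 4 spaces past the 'for',
    # so they run once per row.
    collected.append(lot)
    collected.append(output)
# This line is back at the 'for' statement's own level,
# so it runs once, after the loop has finished.
total_rows = len(rows)
```

Mixing a tab on one body line and spaces on the next is what produces the "unindent does not match an outer indentation level" error, even when the lines look aligned in the editor.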

Re-indenting the lines of the for block to a consistent 4 spaces fixed the problem. Thanks, Paul!

And then I got this:

WARN: Error invoking script.
Traceback (most recent call last):
File "function:updateData", line 9, in updateData
at com.inductiveautomation.ignition.common.AbstractDataset.getColumnIndex(
at com.inductiveautomation.ignition.common.AbstractDataset.getValueAt(
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.base/java.lang.reflect.Method.invoke(Unknown Source)
java.lang.ArrayIndexOutOfBoundsException: java.lang.ArrayIndexOutOfBoundsException: Column 'LOT' doesn't exist in this dataset.

Column names are case sensitive. Something to look at in your dataset.
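To illustrate with a quick sketch, using a dict as a stand-in for one dataset row (the column names here are hypothetical):

```python
row = {"Lot": "A1", "Output": 10}

try:
    value = row["LOT"]   # wrong case: the key does not match
except KeyError:
    value = None         # the lookup fails, much like the dataset error above

exact = row["Lot"]       # matching the exact case succeeds
```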

Thank you, Jordan; indeed, it was a wrong column name.