system.tag.queryTagHistory returns None for values and -2147483129 for quality in gateway scope but not script console

Version: 8.1.17 (b2022051210)

This is intended to return tag history from a web endpoint using the Web Dev module. I have pared things down to a minimal version (below), and the only difference between whether it works or not seems to be the scope the script runs in. If I run this minimized version in the Script Console, it works, returning the values and qualities as expected. When it is called from a Web Dev endpoint, or from a gateway tag change event script (for testing), it returns the rows, but the value and quality columns contain only None and -2147483129 respectively.

-2147483129 looks less like a random overflow and more like a Bad quality code: reinterpreted as an unsigned 32-bit value it is 0x80000207, an offset of 519 from the Bad base 0x80000000. Per the quality code listing, 519 would be either an expired trial license or 'object not found' (518 would be Bad_LicenseExceeded). I tried with the trial restarted (it's normally not active; we have a license) and still got Nones. And the idea that the gateway couldn't find tag history that the Script Console can find doesn't make sense to me, so maybe this isn't a useful direction, though it would at least explain the None values.
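To double-check that arithmetic, the raw quality integer can be decoded by treating it as an unsigned 32-bit value and splitting off the Bad bit. This is a plain-Python sketch of the bit math only; which quality name the resulting subcode maps to is taken from the quality code listing, not verified here.

```python
# Decode the raw 32-bit quality integer reported by queryTagHistory.
RAW_QUALITY = -2147483129

# Reinterpret the signed Java int as an unsigned 32-bit value.
unsigned = RAW_QUALITY & 0xFFFFFFFF   # 0x80000207

# The high bit (0x80000000) marks a Bad quality; the low bits are the subcode.
bad_bit_set = bool(unsigned & 0x80000000)
subcode = unsigned & 0x7FFFFFFF       # offset from the Bad base

print(hex(unsigned), bad_bit_set, subcode)  # → 0x80000207 True 519
```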

For some extra oddness, this had previously been working on the Web Dev endpoint; I've traced the failure down to this minimal version.

Does anyone have any ideas why I'm seeing this behavior or how to rectify it?

def minimal_web_query(request, session=None):
	import json
	
	lg = system.util.getLogger('tagh_endpoint')
	lg.info('logger started')
	
	# rows as dictionaries in a list
	jsonRows = []

	# this data is static to compare scope behavior; other tags have been tried
	data = json.loads(u'{"endDate": "2023-01-02 00:00:00", "intervalMinutes": 15, "paths": ["OXIDIZER1/Oxidizer chambers average temperature"], "returnFormat": "Tall", "startDate": "2023-01-01 00:00:00", "aggregationMode": "Average"}')
	lg.info('data (params) as received: ' + repr(data))
	
	th_dset = system.tag.queryTagHistory(**data)  # the tag history dataset
	first_row = list(system.dataset.toPyDataSet(th_dset)[0])

	# this shows the None, -2147483129 already if this is called from a gateway scope
	lg.info('first row of th_dset: {}'.format(first_row)) 
	
	# the rest is what is left of the web dev endpoint having torn all the checks and error handling
	log_rows = 5  # log only the first few non-timestamp values below to prevent spam
	for row in range(th_dset.getRowCount()):
		jsonRow = {}
		for col in th_dset.getColumnNames():
			if col in ("t_stamp", "timestamp", "time"):  # datetime objects don't play nicely with json
				if data.get('useMillis'):
					jsonRow[col] = th_dset.getValueAt(row, col).toInstant().toEpochMilli()
				else:
					jsonRow[col] = th_dset.getValueAt(row, col).toInstant()
			else:
				jsonRow[col] = th_dset.getValueAt(row, col)
				if log_rows:  # only log the first few values
					lg.info('jsonRow[col] = th_dset.getValueAt({}, {}) set to: {}'.format(row, col, repr(jsonRow[col])))
					log_rows -= 1
		jsonRows.append(jsonRow)
	lg.info(repr(jsonRows[:5]))
	return {'json': jsonRows}
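For what it's worth, the serialization loop itself behaves identically outside the gateway, which is part of why I suspect the queryTagHistory call rather than this code. Here is a plain-Python stand-in that exercises the same row/column logic, including the useMillis branch; the FakeDataset class and the sample row are made up for illustration (the real object is the dataset returned by system.tag.queryTagHistory), and epoch-millisecond conversion uses datetime in place of java.util.Date.toInstant().toEpochMilli().

```python
from datetime import datetime, timezone

class FakeDataset(object):
	"""Made-up stand-in for the Tall-format dataset returned by queryTagHistory."""
	def __init__(self, columns, rows):
		self._columns = columns
		self._rows = rows
	def getRowCount(self):
		return len(self._rows)
	def getColumnNames(self):
		return self._columns
	def getValueAt(self, row, col):
		return self._rows[row][self._columns.index(col)]

def to_millis(dt):
	# Same idea as Date.toInstant().toEpochMilli() in the endpoint.
	return int(dt.timestamp() * 1000)

def rows_to_json(dset, use_millis=False):
	json_rows = []
	for row in range(dset.getRowCount()):
		json_row = {}
		for col in dset.getColumnNames():
			value = dset.getValueAt(row, col)
			if col in ("t_stamp", "timestamp", "time"):
				json_row[col] = to_millis(value) if use_millis else value.isoformat()
			else:
				json_row[col] = value
		json_rows.append(json_row)
	return json_rows

# Sample row (values invented): one path, one averaged value, one quality, one timestamp.
ts = datetime(2023, 1, 1, tzinfo=timezone.utc)
dset = FakeDataset(["path", "value", "quality", "t_stamp"],
                   [["OXIDIZER1/sample", 451.2, 192, ts]])
print(rows_to_json(dset, use_millis=True))
# → [{'path': 'OXIDIZER1/sample', 'value': 451.2, 'quality': 192, 't_stamp': 1672531200000}]
```

Run against good data the loop produces the expected dictionaries, so the None/-2147483129 values must already be present in the dataset the gateway-scope call hands back.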