system.tag.queryTagHistory: losing data when intervalMinutes is set higher

Hello, first time posting, so please let me know if I am doing anything wrong.

I am trying to pull the history of a tag with intermittent data: there will be several seconds of non-zero readings (spikes), but most of the time the reading is 0.

When I use the queryTagHistory function, it does not include these spikes if I set intervalMinutes too high (above 0.4). Does the function not sample all of the datapoints in each interval to determine the maximum as the interval increases?

Example:
Returns the data correctly, but with over 1,000,000 rows:
system.tag.queryTagHistory(paths=tagPaths, columnNames=tagTitles, startDate=startTime, endDate=endTime, aggregationMode="Maximum", intervalMinutes=0.4, returnFormat='Wide')

Returns the data incorrectly; all rows return 0:
system.tag.queryTagHistory(paths=tagPaths, columnNames=tagTitles, startDate=startTime, endDate=endTime, aggregationMode="Maximum", intervalMinutes=10, returnFormat='Wide')
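
For what it's worth, a raw pull over the same range (a sketch using the same tagPaths, startTime, and endTime; as I understand it, returnSize=-1 should return the as-stored values with no interval aggregation) can confirm whether the spikes are actually in the historian:

# Sanity check: returnSize=-1 asks the historian for the raw, as-stored
# values with no interval aggregation, so any logged non-zero spikes
# should appear in this result regardless of interval settings.
rawData = system.tag.queryTagHistory(paths=tagPaths, startDate=startTime, endDate=endTime, returnSize=-1, returnFormat='Wide')
print "raw rows: %d" % rawData.getRowCount()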

It looks strange to me that you are seeing all rows return 0 at intervalMinutes=10. Given that you have a maximum higher than 0 with the smaller interval, I would expect you to still see a value higher than 0 at a larger interval. What is the expected data that you would want to see at intervalMinutes=10, and what data are you getting at intervalMinutes=0.4? Could you share a screenshot of the results from both cases? What startDate and endDate are you using in both cases?
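
If screenshots are awkward to grab, a quick sketch like this (assuming the same tagPaths, tagTitles, startTime, and endTime from your post) would print the row count and the per-column maximum for both intervals, which should show whether the 0.4-minute query really contains values above 0:

# Run both queries and summarize them, so the two intervals can be
# compared directly instead of via screenshots.
for interval in [0.4, 10]:
    ds = system.tag.queryTagHistory(paths=tagPaths, columnNames=tagTitles,
                                    startDate=startTime, endDate=endTime,
                                    aggregationMode="Maximum",
                                    intervalMinutes=interval,
                                    returnFormat='Wide')
    print "intervalMinutes=%s -> %d rows" % (interval, ds.getRowCount())
    # Column 0 in 'Wide' format is the timestamp; the rest are tag columns.
    for col in range(1, ds.getColumnCount()):
        values = [ds.getValueAt(row, col) for row in range(ds.getRowCount())]
        print "  %s: max=%s" % (ds.getColumnName(col), max(values) if values else None)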