Learning to use the queries

query question

If I want all the gross count for this month from the second 8 hours in a day to go into a report, how would the query look?
If the tag has been resetting every day or every shift, is there a query that will work for this?
Or do I need to make and historize an odometer type counter?

I watched the named query in a report video.
Named Queries in Reports - Report Data (inductiveuniversity.com)

I don’t see how it looks though. Maybe I am missing it. I don’t see the report in the video.
I have a report where I put keys on the design and see the reports that way.
It is new to me.

Completely depends on how the data is stored in your database table. However, you will probably at some point be using a GROUP BY MONTH(yourDateColumnHere) or similar to be able to use aggregate functions like SUM() in the select clause of your query.

Here’s a decent SQL primer: SQL Introduction

select *
from tablename
where hours >7 and hours <15

like this then?

And how do I put that in the report? I don’t understand how to show the query as data in the report.
I will watch the video I mentioned again.

I think my first problem is that when I historize, I don’t know where to then find that data.

You make it a data source in the report, and from there you will be able to reference the data on different report components.

That query will give you all records from any day of the week/month/year, as long as it is after 7 and before 15 hours. You can filter further by making sure the month and year are equal to some value, or the day, depending on how granular you need the info.

will my date range I set in the report work to curtail the range ?
or is there even more to do?
Like I need to setup a month range

Depending on what your data looks like, you may need to use COUNT(), and for your report to use the current month you would need to derive that with GETDATE(). All assuming you are using MS SQL Server.

This will return all rows from the current month with the hour between 7 and 15:

where year(DATE) = year(GETDATE()) and month(DATE) = month(GETDATE()) and datepart(HOUR, DATE) > 7 and datepart(HOUR, DATE) < 15

You probably want your month range as a parameter to be fed in, with the default value of the current month in the expression. You can then reference that in your query.

thanks for this

I don’t know where the tag history is storing the values, so for the table name, I am not sure.

Then, I am not sure how to pass the parameter.
I am looking at this video, as I see a parameter is being passed.
Tag Historian Query - Report Data (inductiveuniversity.com)

I don’t know much about the historian, but it partitions the tables by default, so they would change over time. You can disable that, but I don’t know whether that’s a bad idea or not. I have a database that logs infrequent tag changes with partitioning disabled, which looks like this: [screenshot omitted]
*There are actually more tables, but this one contains the timestamp.

I need to ask more basic questions.
Tag historian query or named query?

I need to use where conditions
The data is from a historized tag

Is there a way to append the tag history query in the report section?
I don’t see it.

If I need to use the named query, it is partitioning each month I think. How do I overcome that?

I don’t quite follow all of your questions.
If your tag is historized with the default settings, the tables will be partitioned.

You don’t need to use a named query for a report.

You would need to create a query data source on your report for this information, and then use the info from the query on your report.

If you want to disable partitioning: [screenshot of the historian settings omitted]

I don’t like posting screenshots myself because I have to remove some stuff to make it generic

  SELECT *
  FROM  sqlth_partitions 

I tried all the partitions I see to find these historized tags.

However, it is like this: [screenshot of the partition table contents omitted]

So I have no idea how I am to find the gross count variable in that.

Partitioning will have to remain enabled, I think.
I don’t have permission to change it, or access to the config page anyway.

I see where to pass the parameter into the query now though.
I just put the question mark in and do it like this:
SQL Query - Report Data (inductiveuniversity.com)

Just now I have startdate and enddate to pass, and I have to figure out how to compare those with the data in my table. As noted above, I don’t know which date column to look at for this historized tag.

Hopefully someone who knows more about Historian will chime in, since it doesn’t sound like a simple query will work for you. Also the solution is going to depend on what your data looks like, whether you want a count of tag change events, or maybe you are already recording a running total, etc.

Yeah, I was wondering if the query will do so much heavy lifting for me that I don’t need to make new tags that are sanitized or formatted like an odometer. Right now my tags are resetting in the PLC daily or every 8 hours, depending on the machine.

I think this database location obstacle is the critical part for me though.
I think once I know where to go to get the data, I will have a much easier time to learn where to massage it and how.

well it seems like if my tags were made up in a sanitized way, like if I made tags with a script

`if between 700 and 1500 hours, then add the value to this gross shift 2 tag`

Then I could just straight use the tag historian max. The startdate and enddate would define the report.

Then I just wonder, how to address the case of a roll over. Like then the max will miss some amount of the rolled over value.
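One common way to handle that rollover, assuming the counter only ever drops in value when it resets, is to sum the deltas between successive history samples instead of taking the max. A rough sketch (the function and its name are mine, not anything from Ignition):

```python
def gross_from_samples(samples):
    """Sum an odometer-style counter that periodically resets.

    samples: chronologically ordered counter readings, e.g. from history.
    A drop in value is treated as a reset; the reading after the reset is
    counted in full, so production across the reset isn't lost. Note the
    first reading seeds the total, which assumes the counter started at 0
    within the queried window.
    """
    total = 0
    prev = None
    for value in samples:
        if prev is None:
            total = value          # first reading seeds the total
        elif value >= prev:
            total += value - prev  # normal increment
        else:
            total += value         # reset happened; count post-reset parts
        prev = value
    return total

print(gross_from_samples([0, 40, 90, 5, 30]))  # 90 before the reset + 30 after = 120
```

With this approach the max never has to "see" the pre-reset value; every increment is captured as it happens.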

You may still be able to use the historian as you have it now, but first a few questions:

  1. What are your tag values showing?
  2. Do these increment by 1 each time something is counted? Therefore, in your history, the latest recorded value of the tag between times 700-1500 would be the gross count you’re after?
  3. When are these tag values reset?

However, what makes querying manually from the historian more difficult, as already mentioned here, is the fact that the historian uses partitioning. You shouldn’t turn this off in 99% of cases: partitioning exists to keep your query times short, and without it your data table will likely grow very large and take significantly longer to query.

You would need to use some logic, say in a script, to build a list of tables to query from. The partition table (sqlth_partitions) will tell you which tables apply to what date ranges, so you could query this to get the list of tables you need to query and then run the same query on each of the relevant tables.
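As a sketch of that idea: the default monthly partitions follow a predictable naming pattern, so you can generate candidate table names for a date range. The pattern and driver id below are assumptions; the authoritative list should always come from querying sqlth_partitions, since pruned or re-created partitions may not follow it.

```python
from datetime import date

def partition_tables(start, end, driver_id=1):
    """Monthly historian partition table names covering [start, end].

    Assumes the default naming scheme sqlt_data_<drvid>_<YYYY>_<MM>;
    in practice, query sqlth_partitions for the real list of tables
    and their time ranges.
    """
    tables = []
    year, month = start.year, start.month
    while (year, month) <= (end.year, end.month):
        tables.append("sqlt_data_%d_%04d_%02d" % (driver_id, year, month))
        month += 1
        if month > 12:
            year, month = year + 1, 1
    return tables

print(partition_tables(date(2021, 11, 15), date(2022, 2, 1)))
# ['sqlt_data_1_2021_11', 'sqlt_data_1_2021_12', 'sqlt_data_1_2022_01', 'sqlt_data_1_2022_02']
```

You would then run the same per-table query against each name in the list and combine the results (e.g. with UNION ALL).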

But first, questions above.

  1. gross count of product made is the one I am working with at the moment, and the value resets per day or per shift
    eventually I will be looking at the net as well and need to do some math to get the scrap and efficiency

  2. they increment by different amounts, but the count is accurate; sometimes the machine runs at different speeds, and the interval at which the data is collected from the PLC into Ignition is about once every 10 seconds, I think
    2b. yes, the last non-zero value before a reset would be the value desired, but I also have to get month-long data

  3. resets are each 8 hours or each 24 hours depending on which machine

Do you have a way for me to always query the right partition? I think the partitioning is just by year and month. I am not sure how I would write code to adjust for the different start and end dates.

If I use the tag historian with sanitized tags, it is more work for the gateway to process the tag into a tag for each shift for each of the different tags. I think the data is easy to make reliable though.

Just FYI, this is the historian table structure:
Ignition Historian Tables - dbdiagram.io

From what you’ve said, you can use the tag historian, however if it were me, I would be creating a new table in SQL to store this data. I would be inserting a row every time a new part is counted so that you would end up with a table that holds the timestamps (in unix epoch ms) of each part that has been processed.

E.g. the data might look like this:

id     timestamp       partNo    tagPath
1      1631739011000   120394    path/to/tag
2      1631739722000   100334    path/to/another/tag

You should really create at least two tables: one that stores the data and one that stores the tags, the same way the Ignition historian tables are defined, where the data table links to the tags by their ids.

Timestamps should be stored in unix epoch (i.e. UTC time as a bigint) to avoid DST issues and to make sure querying across different timezones doesn’t affect the results. You can easily get this in a script using:

currentValue.timestamp.getTime() # converts tag's timestamp into unix epoch in ms
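For reference, here is what that epoch-millisecond round trip looks like in plain Python outside Ignition; getTime() on the tag's java timestamp produces the same kind of value:

```python
from datetime import datetime, timezone

def to_epoch_ms(dt):
    """Timezone-aware datetime -> unix epoch milliseconds (UTC-based)."""
    return int(dt.timestamp() * 1000)

def from_epoch_ms(ms):
    """Unix epoch milliseconds -> timezone-aware UTC datetime."""
    return datetime.fromtimestamp(ms / 1000.0, tz=timezone.utc)

ts = datetime(2021, 9, 15, 20, 10, 11, tzinfo=timezone.utc)
ms = to_epoch_ms(ts)
print(ms)                 # 1631736611000
print(from_epoch_ms(ms))  # 2021-09-15 20:10:11+00:00
```

Because the stored value is always UTC-based, two clients in different timezones querying the same window get the same rows; any conversion to local time happens only at display/query time.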

The id column should be auto-incrementing

To do all of this, I would use tag change scripts attached to each of your tags to monitor for changes, and if non-0, then insert a row into the data table. Make sure that you’re not inserting if initialChange is on in the change event. Then your query becomes much simpler.
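The guard in that change script boils down to a tiny predicate; a minimal sketch (names are hypothetical, this is not an Ignition API):

```python
def should_record(initial_change, current_value):
    """Return True when a tag change event should insert a history row.

    Mirrors the advice above: skip the initialChange event that fires
    when the script first subscribes, and skip zero values (the
    freshly-reset state of the counter).
    """
    return (not initial_change) and current_value != 0

print(should_record(True, 5))   # False: initial subscription event
print(should_record(False, 0))  # False: counter just reset
print(should_record(False, 7))  # True: real count, record it
```

Inside the actual tag change script you would call something like this before doing the database insert.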

   SELECT CONVERT(varchar, dbo.fn_ConvertToDateTime(timestamp, 1), 112) as yyyymmdd,
          count(*) as parts_count
   FROM yourDataTable -- the new data table described above; name is a placeholder
   WHERE dbo.fn_ConvertToDateTime(timestamp, 1) BETWEEN DATEADD(Month, -1, GETDATE()) AND GETDATE()
     AND DATEPART(hour, dbo.fn_ConvertToDateTime(timestamp, 1)) BETWEEN 7 AND 15
   GROUP BY CONVERT(varchar, dbo.fn_ConvertToDateTime(timestamp, 1), 112) -- converts ts to yyyymmdd

where you’ll need to add one of these functions to your database to convert a unix timestamp into a local datetime (the query above uses the two-argument version):

CREATE FUNCTION dbo.fn_ConvertToDateTime (@Datetime BIGINT)
RETURNS DATETIME
AS
BEGIN
    DECLARE @LocalTimeOffset BIGINT
           ,@AdjustedLocalDatetime BIGINT;
    SET @LocalTimeOffset = DATEDIFF(second, GETDATE(), GETUTCDATE())
    SET @AdjustedLocalDatetime = @Datetime - @LocalTimeOffset
    RETURN (SELECT DATEADD(ms, @AdjustedLocalDatetime % 60000, DATEADD(minute, @AdjustedLocalDatetime / 60000, CAST('1970-01-01 00:00:00' AS datetime))))
END


CREATE FUNCTION [dbo].[fn_ConvertToDateTime] (@Datetime BIGINT, @ConvertToLocal BIT)
RETURNS DATETIME2(7)
AS
BEGIN
    DECLARE @LocalTimeOffset BIGINT
           ,@AdjustedLocalDatetimeSec BIGINT
           ,@Milliseconds BIGINT;
    SET @LocalTimeOffset = DATEDIFF(second, GETDATE(), GETUTCDATE())

    SET @AdjustedLocalDatetimeSec = @Datetime / 1000 -- Convert to seconds
    SET @Milliseconds = @Datetime % 1000

    IF @ConvertToLocal = 1
        SET @AdjustedLocalDatetimeSec = @AdjustedLocalDatetimeSec - @LocalTimeOffset;
    RETURN (SELECT DATEADD(millisecond, @Milliseconds, DATEADD(second, @AdjustedLocalDatetimeSec, CAST('1970-01-01 00:00:00.000000' AS datetime2(7)))))
END

**All completely untested
Also assuming you’re using T-SQL (MS SQL Server)


that is dauntingly high over my head to understand at the moment

I will read up on that historian tables link in the morning.

I think I caught the part about the data and the epoch time. Thanks for explaining.

I am lost when I read the rest. I will try again in the morning. Thank you for helping me.

I don’t know anything about Ignition’s Reporting module (I assume this is what you are referring to when you say generate a report), but if the tag is being historically recorded using the Tag Historian, couldn’t you just use the system.tag.queryTagHistory scripting function to achieve the same thing? This function handles all the partitioning and everything, even if you change the configuration on the historian partitioning. If it spans multiple time slices, maybe even using system.tag.queryTagCalculations will help too.

I’ve not used those scripting functions much, but some components can bind to tag history as well (which is essentially calling these functions). Maybe you could create a tag that returns a dataset using this capability, and then just bind the tag to the data table in the report?