This is what I currently have to filter for a "reason" (yellow circle), but I cannot seem to figure out how to take the sum of these filtered values to use for a chart (red circle). I have tried COALESCE, but I am either doing it wrong or it will not work in a Named Query.
Any insight into this would be greatly appreciated, as I am new to scripting.
Even running this code, I believe I am still running into the same problem with SUM(). If I understand it correctly, if you try to obtain the sum of a column that has a null value, the answer will always be null. This is the issue I am running into, but I cannot find a way to nix the null out of the SUM() equation so I can obtain the sum of the remaining numbers.
When I try to run the script I only get null in return, but if I take the SUM() expression out, then I get my filtered list when I run the script. What am I missing?
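For reference, SUM() skips NULLs in individual rows and only returns NULL when there are no non-NULL values to add (for example, when the WHERE clause matches no rows). A common pattern is to wrap the aggregate in COALESCE so that case comes back as 0 instead. A minimal sketch, assuming a hypothetical downtime table and column names:

```sql
-- Hypothetical table/columns for illustration only.
-- COALESCE maps a NULL sum (no matching rows) to 0 for charting.
SELECT COALESCE(SUM(DTMIN), 0) AS TotalDT
FROM DownTimeLog
WHERE Reason = 'Electrical';
```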
Show your actual SQL. Keep in mind that in a query with grouping, the output column list typically must consist only of aggregate functions, plus the grouping columns/keys themselves.
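In other words, the shape of a grouped query is roughly this (column names taken from the view mentioned later in this thread):

```sql
-- Every non-aggregated column in the SELECT list must also appear in GROUP BY.
SELECT Reason, SUM(DTMIN) AS TotalDT
FROM vwIG_DownTimeDetailsByPID
GROUP BY Reason;
```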
I tried an iteration of the aforementioned code. It seems to be doing what I want currently, but I may have to get some advice when I try to filter by dates to narrow the list. I do appreciate all of the feedback thus far.
Ok, so I have gotten the previous kinks out of the process I was trying to execute before, but I have run into another problem. I have been able to filter dates and add the column, but now I cannot figure out how to filter the addition so that it only adds rows with certain reason codes. When I execute the code, it gives me the total for all rows between the filtered dates. Here is my code:
I am trying to figure out how to only include the rows that say 'Electrical' and have them be calculated independently from the rest of the rows between the filtered dates.
I know the data type for the "Reason" column (the column with Electrical, Mechanical, Raw Material, and Operational) is an 'nvarchar', so I am not completely sure whether my WHERE clause is filtering properly.
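One way to get a per-reason total independently of the overall total is conditional aggregation with a CASE expression inside SUM. A hedged sketch against the view named in this thread (the half-open date range shown here is an assumption about the intended window):

```sql
-- ElectricalDT counts only 'Electrical' rows; TotalDT covers all reasons in the window.
SELECT
    SUM(CASE WHEN Reason = 'Electrical' THEN DTMIN ELSE 0 END) AS ElectricalDT,
    SUM(DTMIN) AS TotalDT
FROM vwIG_DownTimeDetailsByPID
WHERE [Trans Date] >= '2024-01-02'
  AND [Trans Date] < '2024-01-10';
```

Alternatively, a plain `WHERE Reason = 'Electrical'` with `GROUP BY Reason` yields just the Electrical total on its own row.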
Sorry about that; I will keep it in mind for the future. I tried it, but now I am getting an error.
SELECT Reason, SUM([DTMIN])
FROM vwIG_DownTimeDetailsByPID
WHERE Reason = 'Electrical'
AND (
([Trans Date] BETWEEN '2024-01-02' AND '2024-01-09')
OR ([Trans Date] BETWEEN '2024-01-02' AND '2024-01-09')
OR ([Trans Date] <= '2024-01-02' AND [Trans Date] >= '2024-01-09')
)
Error:
com.microsoft.sqlserver.jdbc.SQLServerException: Column 'vwIG_DownTimeDetailsByPID.REASON' is invalid in the select list because it is not contained in either an aggregate function or the GROUP BY clause.
There was an example of the GROUP BY in my original post.
SELECT Reason, SUM([DTMIN])
FROM vwIG_DownTimeDetailsByPID
WHERE Reason = 'Electrical'
AND (
([Trans Date] BETWEEN '2024-01-02' AND '2024-01-09')
OR ([Trans Date] BETWEEN '2024-01-02' AND '2024-01-09')
OR ([Trans Date] <= '2024-01-02' AND [Trans Date] >= '2024-01-09')
)
GROUP BY Reason
When I found this part of the code, it was ranging between two separate columns instead of one like I am using. I just copy-pasted it and changed the params to see if it would work, not realizing that there would be a duplicate. I'm not sure why, but as soon as I got rid of the duplicate, the code was happy.
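For reference, a sketch of what the query presumably looks like after de-duplication (the duplicate BETWEEN range and the stray reversed-range clause removed, GROUP BY kept):

```sql
SELECT Reason, SUM([DTMIN]) AS TotalDT
FROM vwIG_DownTimeDetailsByPID
WHERE Reason = 'Electrical'
  AND [Trans Date] BETWEEN '2024-01-02' AND '2024-01-09'
GROUP BY Reason;
```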
Ew. Or not. BETWEEN includes both endpoint values. Half-open intervals allow for fractional time periods at an endpoint without specifying the trailing zeros to the full column precision. For example, this:
... WHERE t_stamp BETWEEN '2024-01-01T00:00:00' AND '2024-01-01T23:59:59'
will miss any data points that have non-zero milliseconds in the last second. Using this:
... WHERE t_stamp >= '2024-01-01' AND t_stamp < '2024-01-02'
will get every value with a timestamp on January 1st, no matter the precision of the t_stamp column. Note that the start is checked with greater-than-or-equals while the end is checked with strict less-than. (Half-open means one end of the interval is inclusive and the other is exclusive.)
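Applied to the query in this thread, with the dates pulled out into Named Query parameters (the `:startDate`/`:endDate` parameter names are an assumption; Ignition Named Queries use colon-prefixed parameters):

```sql
-- Half-open range: includes :startDate, excludes :endDate,
-- so consecutive windows never double-count a row.
SELECT Reason, SUM([DTMIN]) AS TotalDT
FROM vwIG_DownTimeDetailsByPID
WHERE Reason = 'Electrical'
  AND [Trans Date] >= :startDate
  AND [Trans Date] < :endDate
GROUP BY Reason;
```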
That Ignition's historian doesn't require half-open intervals for its queries is one of its most annoying flaws, FWIW. (It also means its API is stuck on milliseconds, as any attempt to switch to something more precise would break all current users.)