Hourly and Monthly Reports

OK, I’m a little stumped on how to accomplish this easily. I have two reports built in my project for flow totals.

The daily report needs to populate a row for each hour of the day, at the top of the hour, showing a snapshot of the tag values. At the bottom of the report, it needs to calculate the total of each column for that day. I have the report built, and have it set to add the total at the bottom, but I can’t figure out an elegant way to build a query to populate the hourly readings.

Second, I have a monthly report where I need to populate a row each day with the total of the readings for each day. I would assume this could also be done with just a query?

I have tried filling this in using tag history, and I also set up a transaction group, but neither worked quite right (missed readings, duplicate readings, late readings, etc.).

Any SQL Query gurus want to help a poor soul? :prayer:

–BUMP–

In the hopes that someone has some ideas. Thanks.

To make sure I understand: you need to have a snapshot value every hour, sum those 24 values for one day on one report, and then display each day’s totals for a month in another?

For each of these reports, are you currently logging data every second, minute, etc., or are you starting from scratch? I’m not sure if you are just looking for a query to roll up existing data that is sampled many times per hour, or need help getting a transaction group going to ONLY grab this information every hour and then roll it up.

Dan

Dan,

That is correct. I’m currently logging values every 5 seconds. I need to grab the first value logged after each hour starts (may be 12:00:00, may be 12:00:02, etc…) and drop those in the report, summing the values at the end of the report. The second, monthly report needs to take the sum from each day, drop it in a line, and put the sum of each column at the bottom. I’ve got the report structure set up without any issues; I just haven’t been able to get the query to work correctly, because the first sample each hour is not going to land on the same second every time. If I can get the hourly/monthly samples to query and drop into a table correctly, I think I can get it from there. Thanks.

Joe

Joe,

I think the easiest way to do this is to write a query that gets a list of minimum timestamps per hour, then join that to the original table to only get the rows you care about. Try something like this:

[code]select *
from history_table a,
     (select min(t_stamp) t_stamp
      from history_table
      /* where clause */
      group by to_char(t_stamp, 'MMDDYYYYHH')) b
where a.t_stamp = b.t_stamp
and /* same where clause conditions as above on history table (a) */[/code]

That is Oracle SQL, so if you are using MySQL, for example, you will need to use DATE_FORMAT instead of TO_CHAR. The formats for the date mask can be found here:

http://dev.mysql.com/doc/refman/5.1/en/date-and-time-functions.html#function_date-format

I don’t use MySQL very often, but it will probably be something like
DATE_FORMAT(t_stamp, '%Y %c %d %H')

If you want the first row of a minute, day, month, etc., just change what you group by in the derived table (the query in the from clause). Let me know if this works for you.
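In case it helps to see that rollup logic outside of SQL, here is a rough pure-Python sketch of the same idea: bucket rows by hour and keep only the earliest row in each bucket. The row layout and sample timestamps are made up for illustration.

```python
def first_reading_per_hour(rows):
    """rows: list of (timestamp_str, value) with timestamps like
    'YYYY-MM-DD HH:MM:SS'. Returns the earliest row in each hour,
    in chronological order."""
    first = {}
    for t_stamp, value in rows:
        hour_key = t_stamp[:13]  # 'YYYY-MM-DD HH' identifies the hour bucket
        if hour_key not in first or t_stamp < first[hour_key][0]:
            first[hour_key] = (t_stamp, value)
    return [first[key] for key in sorted(first)]

rows = [
    ("2010-05-01 12:00:02", 100),  # first sample after 12:00
    ("2010-05-01 12:00:07", 101),
    ("2010-05-01 13:00:00", 150),  # first sample after 13:00
    ("2010-05-01 13:00:05", 151),
]
print(first_reading_per_hour(rows))
# [('2010-05-01 12:00:02', 100), ('2010-05-01 13:00:00', 150)]
```

The SQL version does the same thing with min(t_stamp) per hour group and a self-join back to the full row.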

Dan

Dan,

It looks like that did the trick! Thanks for the push in the right direction.

Joe

Another easy way to accomplish this would be to have a transaction group simply store these snapshots on the top of the hour, every hour. That way you’d have a table that needed no extra massaging - it would have the data exactly as you needed it.

To do that, you’d make an expression item that used the expression language to extract the hour out of the current time ( now() ).

Then you’d have the group run every minute with that item as the trigger, with the trigger set to fire on any change.

Thanks, Carl. I’ll look at that also.

OK, slight monkey wrench. I was going over some things with our customer, and it’s not as straightforward as I had thought. The values being pulled for the report are flow totals, pulled from a totalizer register in the PLC. This totalizer register is never reset. However, on the report they want the value at midnight to be zero, 1 AM to be the difference between midnight and 1 AM, 2 AM to be the difference between 1 AM and 2 AM, etc… Right now, I’m just pulling the logged totalizer value for each hour. I sure do love feature creep and vague specifications. Any ideas for this one?

That definitely gets a little trickier. Doing that with SQL could get pretty hairy, so I’d probably build off of Carl’s suggestion. I’m thinking something along the lines of:

- Create a new DB tag (LastFlowTotal), and create an OPC/Group item in your group that writes FlowTotal to this new tag (rather than to a database column).

- Create a new expression item, FlowTotal - LastFlowTotal, and store that value in the database. This way you will store the difference between the current value and the value from the last time the group ran.

Wouldn’t storing 0s at midnight wipe out the totals from the 11 PM–12 AM hour? If that’s what you need, though, you could add an if statement to the expression item to only do FlowTotal - LastFlowTotal if the hour is not 12 AM, else 0. You may be better off just logging every hour, leaving out that first hour’s total (from yesterday), and including the next day’s first hour in the report itself using SQL. Or maybe logging the hour as the previous hour’s total rather than the current one (the 2 PM value is the 2–3 PM total)?

Something like this should work in a group that only runs once an hour, and you would just do a straight select on the table for the report. I think it would also work if you added it to the existing group that logs every 5 seconds (just be sure to keep the original flow rate value you are logging as-is). If you did that, you would have to do a sum(FlowRateChange) and group by hour in your report’s SQL query. That might also eliminate the whole midnight = 0 thing.
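The FlowTotal - LastFlowTotal idea boils down to differencing consecutive readings. A rough Python sketch of just that logic, with made-up sample values (in the real group, the "previous value" would live in the LastFlowTotal tag rather than a local variable):

```python
def hourly_changes(totals):
    """totals: list of raw totalizer readings, one per hour.
    Returns the change from the previous reading; the first entry is 0
    because there is no earlier value to subtract."""
    changes = []
    last = None
    for value in totals:
        changes.append(0 if last is None else value - last)
        last = value
    return changes

print(hourly_changes([100, 150, 225]))  # [0, 50, 75]
```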

If I’m wrong on this, I’m sure one of the IA guys will set me straight :mrgreen:

Good luck!

Dan

Dan,

That’s along the lines of what I was thinking. I think I didn’t explain what they want completely. They don’t want to zero the totalizer at 12AM, they want 12AM to be the zero point on the report. So, whatever value is in the totalizer at 12AM becomes that day’s zero point, and everything else follows from there. It’s a little convoluted, but basically we are working within the limitations of their old control system, so… I’ll start with your suggestions and see what I can come up with. Thanks for the help.

Joe

If your dataset looks like this:

[code]Hour   total
0      500
1      600
2      750
3      900[/code]

Then you could just pull it in as-is and have a script modify it. On a property change event, you could throw something like this in to modify the dataset every time it changes:

[code]if event.propertyName == "data":
   oldData = system.dataset.toPyDataSet(event.source.data)
   newHeader = ["Hour", "total"]
   newData = []
   midnightValue = oldData[0][1]
   for row in oldData:
      newData.append([row[0], row[1] - midnightValue])
   event.source.data = system.dataset.toDataSet(newHeader, newData)[/code]

Robert,

Thanks, I’ll take a look at that also.

Joe

OK, now that I have all this pretty much sorted out, I have a couple more questions:

  1. Is there any way to set the report to auto fit to one page height, or am I stuck with whatever it fills in to?

  2. After the report prints at midnight, my customer wants the system to automatically save a copy in either PDF or XLS/CSV, without any operator intervention picking folders and such. Any ideas on this?

Thanks everyone.

OK, after I’ve had this running for a while, this is what’s going on. Let’s say I have the following info in my database:

[code]tstamp     Flow A   Flow B
00:00:00   100      10
01:00:00   150      20
02:00:00   225      20[/code]

This is what the customer wants to see:

[code]Time       Flow A   Flow B
00:00:00   0        0
01:00:00   50       10
02:00:00   75       0[/code]

However, this is what I’m getting:

[code]Time       Flow A   Flow B
00:00:00   100      10
01:00:00   250      30
02:00:00   475      50[/code]

Any ideas on this one?

I did get the fit to one page issue figured out, but I could also use some ideas about auto saving to a PDF or XLS/CSV without any operator intervention. Thanks in advance.

–Bump–

Hi,

Looking at this thread it looks like there have been all sorts of different ideas thrown around, so I don’t know exactly what you’re doing, but if the data is in the database in that format, you could do the following:

SELECT a.tstamp, coalesce(a.flowa-b.flowa,0), coalesce(a.flowb-b.flowb,0) from tablename a left join tablename b on b.id=a.id-1;

This assumes you have an id column, which I called “id”. If you don’t, you could use your timestamp column instead, with something like “on b.tstamp=date_sub(a.tstamp, interval 1 hour)”.

The coalesce is there because the first row has no previous row to join to, so its differences come back as nulls; coalesce converts the nulls to 0.
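The same join-to-previous-row logic, sketched in plain Python against the sample data from earlier in the thread (the first row has no predecessor, so its flows become 0, which is what the COALESCE handles in SQL):

```python
def diff_from_previous_row(rows):
    """rows: list of [tstamp, flow_a, flow_b] ordered by time.
    Each output row holds the difference from the previous row;
    the first row has no predecessor, so its flows are 0."""
    out = []
    prev = None
    for t, a, b in rows:
        if prev is None:
            out.append([t, 0, 0])
        else:
            out.append([t, a - prev[1], b - prev[2]])
        prev = [t, a, b]
    return out

data = [["00:00:00", 100, 10], ["01:00:00", 150, 20], ["02:00:00", 225, 20]]
print(diff_from_previous_row(data))
# [['00:00:00', 0, 0], ['01:00:00', 50, 10], ['02:00:00', 75, 0]]
```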

Hope this helps,

OK, I’ve played around with the different methods proposed in the thread, and here is what I’ve come up with.

First, as clarification, I’m logging my tags to the database on 15 second intervals. I have a query pulling the timestamp and six tags into a dataset, using Dan’s code to only pull the first reading each hour.

Next, I tried to set up an event script to parse the dataset, do the calculations, and create a new dataset, using Robert’s suggestion. What I’m not completely clear on is this. Robert’s script was as follows:

if event.propertyName == "data":
   oldData = system.dataset.toPyDataSet(event.source.data)
   newHeader = ["Hour", "total"]
   newData = []
   midnightValue = oldData[0][1]
   for row in oldData:
      newData.append([row[0], row[1] - midnightValue])
   event.source.data = system.dataset.toDataSet(newHeader, newData)

Does this only work on one data column, or will it work on all six values in each row without modifications?

I tried Colby’s query, but it times out, I’m assuming because of the number of records that need to be scanned. I still need to try combining Colby’s and Dan’s queries to pull the first hourly reading and then coalesce that.

Also, to add another wrinkle, they’ve asked me if I can divide the first three values in each row by 1000, but report the last three values as the above script calculates. Any thoughts on this?

I can only access the system by dial-up connection (ugh) or by driving 7 hours (double ugh), so I’m trying to collect a couple of different options that I can try when I dial in tomorrow.

Thanks for everyone’s help on this!

–bump again–

if event.propertyName == "data":
   oldData = system.dataset.toPyDataSet(event.source.data)
   newHeader = ["Hour", "total"]
   newData = []
   midnightValue = oldData[0][1]
   for row in oldData:
      newData.append([row[0],row[1]-midnightValue])
   event.source.data = system.dataset.toDataSet(newHeader, newData)

I’ve determined that this only works as-is on one data column. Can anyone suggest any modifications to get this to work on multiple columns? Thanks.
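One way to generalize that script is to subtract the midnight row column by column instead of only subtracting column 1. This is only a sketch of the transform itself, not from the thread: the Ignition plumbing (the propertyChange wrapper, toPyDataSet/toDataSet) is left out, and the assumption that the first three value columns are the ones to divide by 1000 comes from the earlier request, so adjust the column positions to match the real dataset.

```python
def rebase_to_midnight(rows, scaled_columns=(1, 2, 3), scale=1000.0):
    """rows: list of [hour, v1, v2, v3, v4, v5, v6], first row = midnight.
    Subtracts the midnight row's value from every value column, and
    divides the columns listed in scaled_columns (assumed here to be
    the first three value columns) by scale."""
    midnight = rows[0]
    out = []
    for row in rows:
        new_row = [row[0]]  # keep the hour column as-is
        for col in range(1, len(row)):
            value = row[col] - midnight[col]
            if col in scaled_columns:
                value = value / scale
            new_row.append(value)
        out.append(new_row)
    return out

data = [
    [0, 1000, 2000, 3000, 10, 20, 30],
    [1, 1500, 2600, 3900, 15, 22, 33],
]
print(rebase_to_midnight(data))
# [[0, 0.0, 0.0, 0.0, 0, 0, 0], [1, 0.5, 0.6, 0.9, 5, 2, 3]]
```

Inside Ignition, the loop body would replace the single `newData.append([row[0], row[1] - midnightValue])` line in the original script.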

It sounds like you might consider:

  1. Replicating this system on a local development system to avoid the frustrating delay.
  2. Purchasing a few hours of design support and letting us give you a hand over the phone w/ GoToMeeting.