I am in the process of designing a screen for the supervisors that would give runtime and efficiency numbers. I was having some trouble coming up with a SQL query that did what I wanted, so I found a workaround that seems to work, but I wanted to present it here to see if there is anything that might come back and bite me in the behind.
First off, the table I needed to pull the time frame from is quite large and grows more every day; this is the historical table. As I have come to find out, the larger the table, the more sluggish queries run against it, even with indexes and keys.
So the first thing I did was create a view on the SQL server. This view only looks at the past 48 hours of historical data, so now instead of over 1 million records it is under 50K.
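To make sure I'm describing it clearly, here is a minimal sketch of the view idea using SQLite in Python. The table and column names (historical_data, t_stamp, machine, runtime) are made up for illustration; the actual schema on our server is different, and the date function syntax would differ on SQL Server (DATEADD/GETDATE).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical stand-in for the large historical table.
cur.execute("CREATE TABLE historical_data (t_stamp TEXT, machine TEXT, runtime REAL)")

# One row well outside the 48-hour window, one inside it.
cur.execute("INSERT INTO historical_data VALUES (datetime('now','-10 days'), 'M1', 4.5)")
cur.execute("INSERT INTO historical_data VALUES (datetime('now','-1 hour'), 'M1', 7.2)")

# The view restricts anything that queries it to the last 48 hours,
# so downstream queries scan far fewer rows than the full table.
cur.execute("""
    CREATE VIEW recent_history AS
    SELECT * FROM historical_data
    WHERE t_stamp >= datetime('now', '-48 hours')
""")

rows = cur.execute("SELECT COUNT(*) FROM recent_history").fetchone()[0]
print(rows)  # only the recent row is visible through the view
```

Note that a plain view like this doesn't store anything; it just narrows what each query has to look at, so the time filter gets applied consistently everywhere.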
Then, in my FSQL project, for each group that required the time I created a query that runs against this view and then stores the information back into the main historical table.
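The aggregate-against-the-view step looks roughly like the following sketch, again using SQLite with made-up names. For simplicity I'm writing the result into a separate summary table here rather than back into the historical table, and I don't know the exact query FSQL generates, so treat this only as the general shape of the approach.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical schema: raw history plus a table the group query writes into.
cur.execute("CREATE TABLE historical_data (t_stamp TEXT, machine TEXT, runtime REAL)")
cur.execute("CREATE TABLE runtime_summary (machine TEXT, total_runtime REAL)")
cur.executemany(
    "INSERT INTO historical_data VALUES (datetime('now','-1 hour'), ?, ?)",
    [("M1", 2.0), ("M1", 3.0), ("M2", 5.0)],
)

# Same 48-hour view as before.
cur.execute("""
    CREATE VIEW recent_history AS
    SELECT * FROM historical_data
    WHERE t_stamp >= datetime('now', '-48 hours')
""")

# Aggregate against the small view instead of the full table,
# then store the result for the supervisor screen to read.
cur.execute("""
    INSERT INTO runtime_summary (machine, total_runtime)
    SELECT machine, SUM(runtime) FROM recent_history GROUP BY machine
""")

totals = dict(cur.execute("SELECT machine, total_runtime FROM runtime_summary"))
print(totals)
```

The point is that the expensive GROUP BY only ever touches the 48-hour slice, not the million-plus rows in the full table.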
It seems to run okay, and so far our SQL server resources have been running at about 50%, which is kind of high; most places seem to say it should not go over 30%. So does that mean this way is incorrect? Is there a better way?
Thanks and have a great day.