This topic came up a lot during training sessions. The most common cause I've found is in the tag's history configuration, specifically the Deadband Style property.
There are three options: Auto (the default), Analog, and Discrete.
Auto chooses either Analog or Discrete based on the tag's data type. Basically, anything with a floating-point data type is treated as Analog.
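As a rough mental model of that auto-selection (my assumption of the behavior, not the actual implementation):

```python
# Hypothetical sketch of how the Auto deadband style picks a mode.
# Assumption: floating-point tags get Analog, everything else Discrete.
def auto_deadband_style(data_type):
    return "Analog" if data_type in ("Float4", "Float8") else "Discrete"

print(auto_deadband_style("Float8"))   # Analog
print(auto_deadband_style("Int4"))     # Discrete
print(auto_deadband_style("Boolean"))  # Discrete
```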
An Analog deadband style always assumes a linear path. Think of temperature: it's impossible to jump from one temperature to another without transitioning through the values in between. If we assume linear interpolation, the historian can discard points that fall on a linear gradient between two stored points.
If a tank was heated over a day at a linear rate of change, only two points actually need to be recorded in the historian to recreate the historical trend: the start and end points. Every value in between can be recovered by linear interpolation. If there were any significant deviations during the day, the analog deadband would have triggered additional points to be stored.
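To make that concrete, here is a minimal Python sketch (purely illustrative, not Ignition code) that recreates the in-between values from only the two stored points:

```python
def interpolate(t, t_start, v_start, t_end, v_end):
    """Linearly interpolate the value at time t between two stored points."""
    fraction = (t - t_start) / float(t_end - t_start)
    return v_start + fraction * (v_end - v_start)

# Tank heated from 20.0 to 80.0 degrees over 24 hours (86400 seconds).
# Only the start and end points were stored; everything else is recovered.
t_start, v_start = 0, 20.0
t_end, v_end = 86400, 80.0

for t in (0, 21600, 43200, 64800, 86400):  # every 6 hours
    print(t, interpolate(t, t_start, v_start, t_end, v_end))
```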
This is also why some Power Charts appear to interpolate to the current value: the trend only updates the previous sample when a new sample is taken. The historian doesn't store a point right away. When a new value arrives, it checks whether the held value still lies on a linear gradient (within the deadband) between the last stored point and the new value. If it does, the held value is discarded, the new value goes into the holding register, and nothing is written to the historian. Only when a new value falls off that gradient (outside the deadband) is the held value inserted into the historian, with the new value taking its place in the holding register.
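Here is a simplified Python sketch of that holding-register behavior as I understand it. The class and parameter names are invented for illustration, and the real historian's algorithm is more involved, but it shows why a steady ramp collapses to a handful of stored points:

```python
class LinearDeadbandCompressor:
    """Simplified sketch of the holding-register logic described above.
    Not the actual Ignition implementation."""

    def __init__(self, deadband):
        self.deadband = deadband
        self.last_stored = None  # last point actually written to the historian
        self.held = None         # candidate point sitting in the holding register
        self.archive = []        # stands in for the historian table

    def add(self, t, value):
        if self.last_stored is None:
            # The first point is always stored.
            self.last_stored = (t, value)
            self.archive.append((t, value))
            return
        if self.held is None:
            self.held = (t, value)
            return
        # Project a straight line from the last stored point to the new point
        # and check whether the held point falls on it (within the deadband).
        t0, v0 = self.last_stored
        th, vh = self.held
        expected = v0 + (value - v0) * (th - t0) / float(t - t0)
        if abs(vh - expected) <= self.deadband:
            # Held point lies on the gradient: discard it, hold the new point.
            self.held = (t, value)
        else:
            # Held point breaks the gradient: write it out, then hold the new point.
            self.archive.append(self.held)
            self.last_stored = self.held
            self.held = (t, value)

    def flush(self):
        """Write out whatever is still sitting in the holding register."""
        if self.held is not None:
            self.archive.append(self.held)
            self.last_stored = self.held
            self.held = None


# A linear ramp followed by a jump: only three points end up stored.
c = LinearDeadbandCompressor(deadband=0.5)
for t, v in [(0, 20.0), (1, 21.0), (2, 22.0), (3, 23.0), (4, 30.0)]:
    c.add(t, v)
c.flush()
print(c.archive)  # [(0, 20.0), (3, 23.0), (4, 30.0)]
```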
The best way to prevent this behavior is either to set the Max Time Between Samples property, which forces a sample to be stored at a regular interval, or to set the Deadband Style to Discrete.
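For example, either fix can be merged into an existing tag from a script with system.tag.configure(). The property names below ("historicalDeadbandStyle", "historyMaxAge", "historyMaxAgeUnits") and the tag path are my assumptions based on an 8.x tag JSON export and may differ between versions, so verify them against an export of your own tag:

```python
# Hypothetical example: adjusting history settings on an existing tag.
# Verify property names against your own tag's JSON export before using.
tag_config = {
    "name": "TankTemperature",               # hypothetical tag name
    "historicalDeadbandStyle": "Discrete",   # option 1: Discrete deadband style
    "historyMaxAge": 1,                      # option 2: force a sample...
    "historyMaxAgeUnits": "MIN",             # ...at least once a minute
}

# "m" merges the changes into the existing tag definition.
system.tag.configure("[default]Tanks", [tag_config], "m")
```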