How to understand the scale mode of a tag

I don't fully understand the meaning of the scale mode for a tag.
The user manual says the scale mode determines "if and how the tag value will be scaled between the source, and what is reported for the tag."

My understanding is:

If the scale mode is Off (0), there is no scaling: the data from the source is reported directly for the tag, so the two are identical (1:1).

If the scale mode is Linear (1), there is a linear mapping: the raw low is mapped to the scaled low and the raw high is mapped to the scaled high. In the screenshot below, the raw high is 10.0 and the scaled high is 1.0, so the source data is effectively multiplied by 0.1.

I'm not sure whether my understanding is right. Can anyone correct me if I'm wrong? Thanks!

Your understanding is correct. Also, the scaling will extrapolate beyond the low and high values. You can think of it as the equivalent of y = mx + c in Cartesian coordinates.
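To make the math concrete, here is a small sketch of what the linear mode does (plain Python, not any Ignition API; the 0.0 low values are assumed defaults, the 10.0/1.0 highs are taken from your screenshot):

```python
def scale_linear(raw, raw_lo=0.0, raw_hi=10.0, scaled_lo=0.0, scaled_hi=1.0):
    """Map a raw value onto the scaled range; values outside
    raw_lo..raw_hi are extrapolated along the same line."""
    m = (scaled_hi - scaled_lo) / (raw_hi - raw_lo)  # slope
    c = scaled_lo - m * raw_lo                       # intercept
    return m * raw + c                               # y = mx + c

print(scale_linear(5.0))   # 0.5  (inside the raw range)
print(scale_linear(20.0))  # 2.0  (extrapolated beyond the raw high)
```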

You can test this easily to understand its operation.

  • Create a memory tag, myFloat, of type float.
  • Create a derived tag, myFloatScaled, also of type float. Set the Read Expression to {[~]myFloat} (or select it using the Insert Tag button).

Now edit the myFloat value in the Tag Browser and you should see the scaled value update in myFloatScaled.
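If you'd rather test from a script than the Tag Browser, something along these lines in the Designer's Script Console should show the same behaviour (this assumes an Ignition 8.x system.tag API and that both tags live in the default provider; adjust the paths to match where you created the tags):

```python
# Write a raw value to the source tag, then read the derived (scaled) tag back.
# Tag paths here are assumptions -- adjust them to your own tag provider/folder.
system.tag.writeBlocking(["[default]myFloat"], [5.0])

# The derived tag may take a moment to re-evaluate after the write.
qv = system.tag.readBlocking(["[default]myFloatScaled"])[0]
print(qv.value)  # expect 0.5 with a 0..10 -> 0..1 linear scale
```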

Thanks for your explanation! I tried it as you suggested, and now it's quite clear! 🙂