Hello everyone,
I'm relatively new to Ignition and industrial automation (it's not technically my job at our company, either), but I've been handed this task due to circumstances, and I want to make sure I'm taking the right approach and following best practices, which is a little nerve-wracking when I'm new to this kind of scripting!
Recently, with help from people in this forum with far more knowledge than myself, I managed to resolve an issue related to posting data from a Gateway Tag Change Event to a webhook. You can find the previous thread here.
Since the previous post, our strategy has changed a bit. Our current script is working, but we want to optimize it and add authentication.
Initially, we were posting data on every tag change, but due to the high frequency of changes (sometimes within a few milliseconds), this approach was not sustainable. Our third-party integration requires min, max, and mean values for their workflow, so we modified our strategy to batch process these values every 5 minutes.
How It All Works Together
I have implemented three scripts to handle the collection, calculation, and posting of tag data to the webhook:
- **Project Script Library (`postTagStats.py`):**
  - Initialization: Initializes data storage for each tag.
  - Data Collection: Collects data whenever a tag changes and updates the min, max, sum, and count for each tag.
  - Statistics Calculation: Calculates the min, max, and mean for each tag using the collected data.
  - Data Posting: Posts the calculated statistics to the webhook every 5 minutes.
- **Timer Script (`PostTagStatsTimer`):**
  - Purpose: Triggers the posting of statistics every 5 minutes.
  - Function Call: Calls the `post_statistics` function from the project script library.
- **Tag Change Script (`postTagStatsCollect`):**
  - Purpose: Collects data whenever there is a change in the specified tags.
  - Function Call: Calls the `collect_data` function from the project script library.
Updated Script
Here’s the updated script that collects data, calculates the required statistics, and posts the data every 5 minutes:
`postTagStats.py` *(project library script)*
```python
# postTagStats.py
import system
import base64
from java.lang import Throwable

LOGGER_NAME = "Losant Data Publisher"

# Define the tag paths once at module level so collection, reset, and
# initialization always stay in sync
TAG_PATHS = [
    "[default]NaturalGas/Hot_H2O_Tnk_1_Lvl",
    "[default]NaturalGas/Hot_H2O_Tnk_1_Tmp",
    "[default]NaturalGas/Hot_H2O_Tnk_2_Lvl",
    "[default]NaturalGas/Hot_H2O_Tnk_2_Tmp",
    "[default]NaturalGas/Cold_H2O_Tnk_3_Lvl",
    "[default]NaturalGas/Cold_H2O_Tnk_3_Tmp"
]

# Create the HTTP client once, to be reused
client = system.net.httpClient()

# Initialize data storage
data_storage = {}

def initialize_data_storage():
    # (Re)create an empty accumulator for every tag path
    for tag in TAG_PATHS:
        data_storage[tag] = {
            "min": None,
            "max": None,
            "sum": 0,
            "count": 0
        }

def collect_data():
    # Ensure data_storage is initialized
    if not data_storage:
        initialize_data_storage()

    # Read the tag values
    values = system.tag.readBlocking(TAG_PATHS)

    # Fold each value into the running min, max, sum, and count
    for i, tag_path in enumerate(TAG_PATHS):
        value = values[i].value
        if tag_path not in data_storage:
            system.util.getLogger(LOGGER_NAME).errorf("Tag path %s not found in data storage", tag_path)
            continue
        tag_data = data_storage[tag_path]
        tag_data["min"] = value if tag_data["min"] is None else min(tag_data["min"], value)
        tag_data["max"] = value if tag_data["max"] is None else max(tag_data["max"], value)
        tag_data["sum"] += value
        tag_data["count"] += 1

def calculate_statistics():
    statistics = {}
    for tag, data in data_storage.items():
        if data["count"] > 0:
            statistics[tag] = {
                "min": data["min"],
                "max": data["max"],
                # float() avoids integer truncation under Jython 2.7
                "mean": data["sum"] / float(data["count"])
            }
    return statistics

def post_statistics():
    logger = system.util.getLogger(LOGGER_NAME)
    try:
        # URL and credentials for the webhook
        url = 'https://example.com/our-webhook-endpoint'
        username = "your_username"
        password = "your_password"

        # Ignition's Jython 2.7 has no f-strings, so the Basic auth
        # header is built with plain string concatenation
        credentials = base64.b64encode(username + ":" + password)
        headers = {
            "Content-Type": "application/json",
            "Authorization": "Basic " + credentials
        }

        # Current time in the format the webhook expects (note: this
        # stamps the gateway's local time with a literal 'Z' suffix)
        current_time = system.date.now()
        formatted_time = system.date.format(current_time, "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'")

        # Calculate statistics for the window that just ended
        statistics = calculate_statistics()

        # Prepare the final JSON payload
        payload = {
            "data": statistics,
            "time": {"$date": formatted_time}
        }

        # Send the HTTP POST request
        response = client.post(url, headers=headers, data=system.util.jsonEncode(payload))

        # Log the response from the server
        if response.good:
            logger.infof("HTTP POST request sent to Losant successfully. Status Code: %s, Response: %s", response.statusCode, response.text)
        else:
            logger.errorf("HTTP POST request failed. Status Code: %s, Response: %s", response.statusCode, response.text)

        # Reset data storage after posting so the next window starts fresh
        initialize_data_storage()
    except Throwable as t:
        # Log any Java exceptions that occur
        logger.error("Java Error sending data to Losant", t)
    except Exception as e:
        # Log any Jython exceptions that occur
        logger.error("Jython Error sending data to Losant: " + str(e))

# Initialize data storage when the module loads
initialize_data_storage()
```
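For anyone curious, here's roughly what the encoded payload looks like when it goes out. The numbers are made up and I've trimmed it to two tags to keep it short:

```python
# Roughly what system.util.jsonEncode(payload) produces -- the values
# here are purely illustrative:
{
    "data": {
        "[default]NaturalGas/Hot_H2O_Tnk_1_Lvl": {"min": 41.2, "max": 44.7, "mean": 42.9},
        "[default]NaturalGas/Hot_H2O_Tnk_1_Tmp": {"min": 120.1, "max": 121.8, "mean": 121.0}
    },
    "time": {"$date": "2024-07-01T15:05:00.000Z"}
}
```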
Timer Script: `PostTagStatsTimer`

```python
# Gateway Timer Script: posts the statistics every 5 minutes
# (delay set to 300,000 ms in the timer event's configuration).
# Project library scripts are already in scope for gateway event
# scripts, so postTagStats can be called directly without an import.
postTagStats.post_statistics()
```
Tag Change Script: `postTagStatsCollect`

```python
# Gateway Tag Change Script: fires on every change of the subscribed
# tags and folds the new values into the running statistics.
postTagStats.collect_data()
```
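One thing I'm still unsure about: the tag change script and the timer script run on different gateway threads, and both touch `data_storage` (the tag change events update it, the timer reads and resets it). At a 5-minute cadence this may never bite us, but if guarding it is the right call, I'm guessing something like a `threading.Lock` in the project library is the standard approach. A rough sketch of what I mean (the `storage_lock` name is my own, and I've elided the HTTP part):

```python
# Hypothetical guard for data_storage -- not in the script above.
# Tag change events and the timer fire on different gateway threads,
# so a lock would make the update/read/reset sequence atomic.
import threading

storage_lock = threading.Lock()

def collect_data():
    with storage_lock:
        if not data_storage:
            initialize_data_storage()
        values = system.tag.readBlocking(TAG_PATHS)
        for i, tag_path in enumerate(TAG_PATHS):
            value = values[i].value
            tag_data = data_storage[tag_path]
            tag_data["min"] = value if tag_data["min"] is None else min(tag_data["min"], value)
            tag_data["max"] = value if tag_data["max"] is None else max(tag_data["max"], value)
            tag_data["sum"] += value
            tag_data["count"] += 1

def post_statistics():
    # Snapshot and reset the window while holding the lock, then do the
    # (slow) HTTP POST outside of it so collection is never blocked long
    with storage_lock:
        statistics = calculate_statistics()
        initialize_data_storage()
    # ... build headers and payload from `statistics` and POST as before ...
```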
Questions and Feedback
1. Data Validation: Are we accurately calculating the min, max, and mean values? Are there any best practices or improvements we should consider? Other than logging every single calculation, I'm not sure how we can really verify our numbers are correct, so I suppose we just have to trust the math (though see the self-check sketch after this list).
2. Resource Efficiency: Our gateway doesn't seem overly taxed, but we can't find a breakdown of the resources this script alone is using (memory, CPU, etc.). Any guidance on monitoring the resource usage of this script within Ignition would be appreciated. I can find the general usage, but I'm not sure if something more granular is available in the dashboard(s)?
   (This might be a loaded question! Our memory trends were already high before implementing this script, and I don't believe we've seen adverse effects, but we don't want this script to be the tipping point either!)
3. Authentication: We've added Basic authentication to the HTTPS POST request headers. Is our approach correct for incorporating authentication?
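On question 1, the only idea I've come up with so far (beyond logging everything) is a throwaway self-check that can run in the Designer Script Console: push some made-up readings through the same running-update logic and compare against the straightforward calculation over the whole list. Something like:

```python
# Throwaway self-check for the running min/max/sum/count logic.
# The readings are made up; this runs standalone in the Script Console.
sample_values = [42.0, 40.5, 44.1, 43.3, 41.8]

# Running update, mirroring what collect_data() does per change event
acc = {"min": None, "max": None, "sum": 0, "count": 0}
for value in sample_values:
    acc["min"] = value if acc["min"] is None else min(acc["min"], value)
    acc["max"] = value if acc["max"] is None else max(acc["max"], value)
    acc["sum"] += value
    acc["count"] += 1
mean = acc["sum"] / float(acc["count"])

# Compare against the obvious whole-list calculation
assert acc["min"] == min(sample_values)
assert acc["max"] == max(sample_values)
assert abs(mean - sum(sample_values) / float(len(sample_values))) < 1e-9

# And a quick sanity check on the Basic auth header encoding
# (made-up credentials):
import base64
assert base64.b64encode("user:pass") == "dXNlcjpwYXNz"

print "min=%s max=%s mean=%s" % (acc["min"], acc["max"], mean)
```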
Any insights or suggestions would be greatly appreciated. Thank you!
Further reading for those interested -- I don't want to make this post too long, but this is the breakdown I wrote to explain how everything is calculated in the script. I think my math is correct, but anyone who really likes to read and knows more than I do is welcome to correct me if I'm wrong!
Explanation of Mean, Min, and Max Calculations
In our script, the mean, min, and max values are calculated in the `collect_data` and `calculate_statistics` functions. Here's a detailed breakdown of how each part of the script works:
1. The `collect_data` Function
This function is called whenever there is a tag change. It updates the min, max, sum, and count for each tag.
```python
def collect_data():
    # Ensure data_storage is initialized
    if not data_storage:
        initialize_data_storage()

    # Read the tag values
    values = system.tag.readBlocking(TAG_PATHS)

    # Fold each value into the running min, max, sum, and count
    for i, tag_path in enumerate(TAG_PATHS):
        value = values[i].value
        if tag_path not in data_storage:
            system.util.getLogger(LOGGER_NAME).errorf("Tag path %s not found in data storage", tag_path)
            continue
        tag_data = data_storage[tag_path]
        tag_data["min"] = value if tag_data["min"] is None else min(tag_data["min"], value)
        tag_data["max"] = value if tag_data["max"] is None else max(tag_data["max"], value)
        tag_data["sum"] += value
        tag_data["count"] += 1
```
- Min Calculation: The `min` function is used to update the minimum value if the new value is smaller.
- Max Calculation: The `max` function is used to update the maximum value if the new value is larger.
- Sum and Count: The `sum` of the values is accumulated, and the `count` of values is incremented by 1.
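To convince myself the running update does what I think it does, I traced it by hand for one tag with made-up readings:

```python
# Hand trace of the running update for a single tag (made-up readings).
# State of data_storage[tag] after each change event:
#
#   reading 42.0 -> {"min": 42.0, "max": 42.0, "sum": 42.0, "count": 1}
#   reading 40.5 -> {"min": 40.5, "max": 42.0, "sum": 82.5, "count": 2}
#   reading 44.1 -> {"min": 40.5, "max": 44.1, "sum": 126.6, "count": 3}
#
# calculate_statistics() would then report:
#   min = 40.5, max = 44.1, mean = 126.6 / 3.0 = 42.2
```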
2. The `calculate_statistics` Function
This function calculates the mean, min, and max values based on the collected data. It is called every 5 minutes by the timer script.
```python
def calculate_statistics():
    statistics = {}
    for tag, data in data_storage.items():
        if data["count"] > 0:
            statistics[tag] = {
                "min": data["min"],
                "max": data["max"],
                # float() avoids integer truncation under Jython 2.7
                "mean": data["sum"] / float(data["count"])
            }
    return statistics
```
- Mean Calculation: The mean is calculated by dividing the `sum` of the values by the `count` of the values. For example, readings of 4, 6, and 8 give sum = 18 and count = 3, so mean = 6.0 (the `float()` cast matters under Jython 2.7, where dividing two integers would otherwise truncate).
- Min and Max: These values are taken directly from `data_storage`, where they were updated during data collection.
How It All Works Together
- Data Collection: When a tag changes, the `collect_data` function updates the min, max, sum, and count for that tag.
- Statistics Calculation: Every 5 minutes, the `calculate_statistics` function calculates the mean, min, and max for each tag using the collected data.
- Data Posting: The `post_statistics` function sends the calculated statistics to the webhook.
Summary
- The `collect_data` function updates the min, max, sum, and count for each tag on every change.
- The `calculate_statistics` function computes the min, max, and mean values from the collected data every 5 minutes.
- The `post_statistics` function posts the data to the webhook, ensuring that the third-party integration receives the required statistics.