Support decimals

Hello

We are converting custom software to Ignition Edge and need to support decimals in our tag names for the legacy system. Since Ignition Edge doesn't allow decimals in tag names, we need to convert 34_5 to 34.5 in the Sparkplug DDATA message. Our pipeline is Ignition Edge -> HiveMQ -> Ignition Engine.

Example

Before: MTR_34ddd5kV_ALT_A_Phase_Current (I put ddd because it's easier to find than _)
After: MTR_34_5kV_ALT_A_Phase_Current
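
Since the real tag names use `_` rather than the `ddd` placeholder, a plain `replace` would clobber every underscore. One way to restore only the decimal point is a regex that converts an underscore just when it sits between two digits (a sketch; it assumes your naming convention has no legitimate digit_digit underscores elsewhere):

```python
import re

def restore_decimal(tag_name):
    # Replace an underscore only when it is sandwiched between two digits,
    # e.g. "34_5kV" -> "34.5kV"; all other underscores are left untouched.
    return re.sub(r'(?<=\d)_(?=\d)', '.', tag_name)

print(restore_decimal("MTR_34_5kV_ALT_A_Phase_Current"))
# -> MTR_34.5kV_ALT_A_Phase_Current
```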

I've tried the two methods described below and considered a couple of other options. One would be to just go JSON instead of Sparkplug, but that really isn't something anyone wants to do. Another idea floated was modifying the Tahu code base to decode Sparkplug. I don't really like any of these options, so if anyone has a suggestion for another method, I'm all ears.

Method: Namespace string conversion

Concern:

  • That happens on the Engine side, so the 34.5 wouldn't actually show up in the DDATA when subscribing to spBv1.0/edge/DDATA/WAD1/WAD1.

  • It was always giving errors (likely a config issue)

More Information

  • I'm probably not going to continue down this path unless I'm thinking about it incorrectly, or unless there is a namespace string conversion that can be applied at the Edge instead of the Engine.

Method: Manually pushing DDATA to HiveMQ from the Edge using a tag change script and system.cirruslink.transmission.publish

Concern:

  • I had to turn off RPC; not sure if that will cause problems.

  • It sends as JSON and not Sparkplug B format.

  • I can't find the sparkplug_b module.

More Information:

  • The code below reports that no module is available on both the Edge and the Engine, even though I've seen other people online find these modules.
try:
    import sparkplug_b as sparkplug
    from sparkplug_b import addMetric, MetricDataType
    from com.cirruslink.sparkplug.message.model import MetricDataType
    from com.cirruslink.sparkplug.message.model.SparkplugBPayload import SparkplugBPayload, Metric
except ImportError as e:
    print(e)
  • What I'm currently doing, though I don't really like the technique: a tag change event script on the tags. It also doesn't quite work for us, because the payload shows up as JSON rather than Sparkplug.
      import system
      from java.lang import String
      
      # Define MQTT properties
      topic = "spBv1.0/edge/DDATA/WAD1/WAD1"
      qos = 0  # Quality of Service level
      retain = False  # Retain flag for the message
      clientId = "debugClient"  # Provide a client ID for the connection
      
      # Define the original tag name and value
      original_tag = "AAD1_xxxxxxx_SUB0001_MTR_34ddd5kV_ALT_A_Phase_Current"  # hardcoded for testing
      value = currentValue.value  # value from the tag change event
      data_type = "Float"  # example data type
      
      # Function to process tag name
      def process_tag_name(tag):
          return tag.replace("ddd", ".")
      
      # Convert the tag name
      converted_tag = process_tag_name(original_tag)
      
      # Create the Sparkplug-like payload
      payload = {
          "timestamp": system.date.now().getTime(),  # Include timestamp
          "metrics": [
              {
                  "name": converted_tag,
                  "timestamp": system.date.now().getTime(),  # Each metric can have a timestamp
                  "dataType": data_type,
                  "value": value
              }
          ]
      }
      
      # Serialize the payload to JSON bytes
      payload_json = system.util.jsonEncode(payload)
      payload_bytes = String(payload_json).getBytes("UTF-8")
      
      # Publish the message to the broker
      system.cirruslink.transmission.publish("HSD", topic, payload_bytes, qos, retain)
  • What I think we need to do in the tag change event, but I can't get the modules to import:
import system
from java.lang import String
from com.cirruslink.sparkplug.message.model import MetricDataType
from com.cirruslink.sparkplug.message.model.SparkplugBPayload import SparkplugBPayload, Metric

print("Loaded Sparkplug classes successfully.")

# Additional checks can be added after this line to verify each import.

# Proceed with the rest of your code...

# Define MQTT properties
topic = "spBv1.0/edge/DDATA/WAD1/WAD1"
qos = 0
retain = False
clientId = "debugClient"

original_tag = "AD1_xxxxxxx_SUB0001_MTR_34ddd5kV_ALT_Phase_A_KVA"
value = 12345

def process_tag_name(tag):
    return tag.replace("ddd", ".")

converted_tag = process_tag_name(original_tag)

payload_builder = SparkplugBPayload.newBuilder()
timestamp = system.date.now().getTime()

metric_builder = Metric.newBuilder()
metric_builder.setName(converted_tag)
metric_builder.setDatatype(MetricDataType.Int32)
metric_builder.setIntValue(value)
metric_builder.setTimestamp(timestamp)
payload_builder.addMetrics(metric_builder.build())

payload = payload_builder.build()
payload_bytes = payload.toByteArray()

system.cirruslink.engine.publish("HSD", topic, payload_bytes, qos, retain)
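
Whichever publish call ends up working, the name-mapping and payload-shaping logic can be factored into plain Python that runs (and can be unit-tested) outside the gateway; only the final `system.cirruslink.*` call is gateway-specific. A sketch, using the `ddd` placeholder convention from the examples above (the helper names are my own, not an Ignition API):

```python
def process_tag_name(tag):
    # "ddd" stands in for the decimal point in the legacy names.
    return tag.replace("ddd", ".")

def build_metric(tag, value, data_type, timestamp):
    # Shape one metric dict the way the DDATA-style payload expects it.
    return {
        "name": process_tag_name(tag),
        "timestamp": timestamp,
        "dataType": data_type,
        "value": value,
    }

def build_payload(metrics, timestamp):
    return {"timestamp": timestamp, "metrics": metrics}

ts = 1700000000000  # example epoch-millis timestamp
m = build_metric("MTR_34ddd5kV_ALT_A_Phase_Current", 123.4, "Float", ts)
payload = build_payload([m], ts)
print(payload["metrics"][0]["name"])
# -> MTR_34.5kV_ALT_A_Phase_Current
```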

Thank you!!!

Are these atomic tags, or UDT instances?

With a UDT instance, you could add a string parameter holding the original name, with the dots, then retrieve that parameter whenever you need the original name.

Or is there a reason that wouldn't be enough?

That wouldn't work, because we need the 34.5 to show up in the DDATA message itself. We have an AVRO converter that consumes those DDATA messages to support the legacy system.
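
If the Edge side can't be made to emit the dotted names, one fallback is to do the rename in the component that already decodes the DDATA (the AVRO converter), since at that point the metric names are plain strings. A hypothetical sketch of that rewrite, operating on an already-decoded payload represented as a dict:

```python
import re

def rename_metrics(payload):
    # Convert "34_5"-style names back to "34.5" after the Sparkplug
    # payload has been decoded, before handing it to the legacy system.
    for metric in payload.get("metrics", []):
        metric["name"] = re.sub(r'(?<=\d)_(?=\d)', '.', metric["name"])
    return payload

decoded = {"metrics": [{"name": "MTR_34_5kV_ALT_A_Phase_Current", "value": 1.0}]}
rename_metrics(decoded)
print(decoded["metrics"][0]["name"])
# -> MTR_34.5kV_ALT_A_Phase_Current
```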