Scripting modules and protobuf

I’m developing modules for Ignition 8.3.4 using IntelliJ 2025.2.1 at the moment. I’m having difficulties passing a dictionary/hashmap object as a parameter from the designer/client to the gateway. Dictionaries are flat, but may contain non-primitive types such as java.math.BigInteger and java.util.Date.

Serialization/deserialization seem to work fine, but there’s an “argument type mismatch” exception (or similar) in RpcRoutes before it gets to the implementation. According to the stack trace, it’s happening in RpcDelegate$DelegateRpcHandler.handle. My line numbers do not line up with the call stack, but I can at least hit breakpoints in the latter.

The immediate problem is that IntelliJ refuses to decompile the library containing RpcRoutes when I try to step into it. When I search my set of external libraries, I don’t find any functions matching the call stack. So, I’m likely missing something.

  1. What jar hosts com.inductiveautomation.ignition.gateway.rpc.RpcRoutes?

  2. Is there a best-practice preferred class to use for a dictionary-type parameter?

  3. The debugger tells me that it’s serializing the hashmaps as Collections$UnmodifiableMap. My params are all declared as HashMap. Might this be the issue?

What does your ProtoRpcSerializer look like?

ProtoRpcSerializer.newBuilder()  
  .addGsonAdapter(SqlValue.class, new SqlValueAdapter())
  .addGsonAdapter(java.math.BigInteger.class, new BigIntegerAdapter())
  .addProtoAdapter(Dataset.class,
                   DatasetProto.DatasetPB::parseFrom,
                   ds -> new DatasetSerializerPlus().toProtobufMessage(ds),
                   dsPB -> new DatasetSerializerPlus().fromProtobufMessage(dsPB))
  .build();

Are you calling pyToJava on your dictionaries?


IIRC, from another post, v8.3's default RPC serialization always delivers unmodifiable lists and maps. You'll have to inject your own handlers.


RpcRoutes is in the gateway jar, where "implementation" classes live. Not part of public API, though we certainly can't stop you from decompiling it (and we're making no intentional effort to obfuscate/make it more annoying to decompile).

Yes; don't.
If you must, restrict it down to things you can marshal down to a 'known-safe' format such as JSON - e.g. only accept dictionary-likes containing a restricted set of primitives. If you go this route, make your RPC interface parameters com.inductiveautomation.ignition.common.gson.JsonObject and you'll get easy low-overhead lossless serialization for free with ProtoRpcSerializer. Use TypeUtilities.pyToGson to do a recursive deep-copy of Python objects into GSON objects.

If you really must accept dictionaries containing "real" Jython objects (e.g. you need to serialize functions/lambdas/arbitrary classes), then you'll have to delegate to Java serialization or some other mechanism. ProtoRpcSerializer gives you a couple of different avenues to approach this, e.g. ObjectSerializers. Another option is wrapping the specific payloads you need to package into an "envelope" you decide your own serialization strategy for; this is what we do for named queries in the platform, because of backwards compatibility constraints:

@RpcInterface(packageId = "NamedQueries")
public interface NamedQueryRpc {

  RpcSerializer SERIALIZER =
      ProtoRpcSerializer.newBuilder()
          .addBinaryAdapter(
              JavaSerializedPayload.class,
              (serialized, context) -> serialized.payload,
              (bytes, context) -> new JavaSerializedPayload(bytes))
          .build();

  JavaSerializedPayload execute(
      @NonNull String project,
      String queryPath,
      JavaSerializedPayload parameters,
      String tx,
      boolean getKey)
      throws Exception;

  JavaSerializedPayload executeUnsaved(
      @NonNull String project,
      NamedQuery query,
      JavaSerializedPayload parameters,
      boolean canCache,
      boolean canLimit,
      String tx,
      boolean getKey)
      throws Exception;

  boolean executeSFQuery(@NonNull String project, String path, JavaSerializedPayload parameters)
      throws Exception;

  record JavaSerializedPayload(byte[] payload) {
    @Override
    public boolean equals(Object o) {
      if (!(o instanceof JavaSerializedPayload javaSerializedPayload)) return false;

      return Arrays.equals(payload, javaSerializedPayload.payload);
    }

    @Override
    public int hashCode() {
      return Arrays.hashCode(payload);
    }

    @Override
    public String toString() {
      return "JavaSerializedPayload[payload=%d bytes]".formatted(payload.length);
    }
  }

  static JavaSerializedPayload wrap(Object any) {
    if (any instanceof Dataset dataset) {
      any = new BasicDataset(dataset);
    }
    return new JavaSerializedPayload(SerializationUtils.serialize((Serializable) any));
  }

  static <T> T unwrap(JavaSerializedPayload javaSerializedPayload) {
    return SerializationUtils.deserialize(javaSerializedPayload.payload);
  }
}
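For context, the `SerializationUtils` calls above appear to be the Apache Commons Lang helper. A stdlib-only sketch of the same wrap/unwrap round trip (the `Envelope` class name is my own, just to show the mechanics):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class Envelope {
    // Serialize any Serializable payload into bytes (what wrap() does above).
    static byte[] wrap(Serializable any) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(any);
        }
        return bos.toByteArray();
    }

    // Deserialize the bytes back into the original object graph (what unwrap() does).
    @SuppressWarnings("unchecked")
    static <T> T unwrap(byte[] payload) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(payload))) {
            return (T) ois.readObject();
        }
    }
}
```

Because Java serialization preserves concrete classes, a round trip through such an envelope keeps HashMap, BigInteger, Date, etc. intact, at the usual cost of requiring every payload class on both sides.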

If you pass an arbitrary map, list or set into ProtoRpcSerializer, it will attempt to serialize it (falling back to GSON's reflective serialization by default) and send the elements to the other side along with a collection type hint. Then on the receiving side the elements will be deserialized and packed into an immutable collection of the same type (list/map/set). No attempt is made by ProtoRpcSerializer to preserve actual collection implementation type (e.g. EnumMap vs HashMap vs ConcurrentMap).
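That collection behavior would also explain the original "argument type mismatch": reflective dispatch rejects an unmodifiable map where the target method demands a concrete HashMap. A stdlib-only demonstration (class and method names are hypothetical; plain JDK reflection stands in for the RPC dispatch):

```java
import java.lang.reflect.Method;
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class ArgTypeMismatchDemo {
    // Hypothetical RPC implementation whose parameter is declared as HashMap.
    public static int countHash(HashMap<String, Object> params) { return params.size(); }

    // Same implementation, declared against the Map interface instead.
    public static int countMap(Map<String, Object> params) { return params.size(); }

    public static void main(String[] args) throws Exception {
        // Deserialization hands back an unmodifiable view, not a HashMap.
        HashMap<String, Object> inner = new HashMap<>();
        inner.put("a", 1);
        Map<String, Object> wire = Collections.unmodifiableMap(inner);

        Method strict = ArgTypeMismatchDemo.class.getMethod("countHash", HashMap.class);
        try {
            strict.invoke(null, wire);  // fails: Collections$UnmodifiableMap is not a HashMap
        } catch (IllegalArgumentException e) {
            System.out.println("strict: " + e.getMessage());
        }

        Method loose = ArgTypeMismatchDemo.class.getMethod("countMap", Map.class);
        System.out.println("loose: " + loose.invoke(null, wire));  // prints "loose: 1"
    }
}
```

The practical takeaway is to declare RPC interface parameters against the interface type (Map, List, Set) rather than a concrete implementation.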


Lots of good advice allowed me to get to a good solution on this one. Thank you @pturmel for confirming this is a real issue that we needed to handle, not just pilot error on the programmers’ part.

The key from @paul-griffith is that JSON data structures have built-in serialization with the GSON adapters. So one can pass a PyDictionary as a JsonObject, a PyList as a JsonArray, etc. Note that nulls are passed as JsonNull, but all of these types derive from JsonElement, so you can check for nulls by checking the class type. Thank you @bmusson for directing me to the TypeUtilities.

The serialization method takes in Jython objects as parameters, converts them to the appropriate GSON object, uses the default GSON serialization/deserialization, then converts the GSON object to native Java objects for use in the modules. Example below.

protected JsonObject SafePyDictionaryToJsonObject(PyDictionary pyDictionary) {
    JsonElement je = TypeUtilities.pyToGson(pyDictionary);
    if (!(je instanceof JsonObject))
        je = new JsonObject();
    return (JsonObject)je;
}

The GSON serialization is only ‘mostly’ free. It will incorrectly convert data types: int→long, long→bigint, long→double, and likely others that I haven’t tripped over yet. With the strong type checking in Java, you’ll likely have to cast something back. Thankfully, the boxed primitives all derive from Number and allow a simple conversion back to the appropriate type.

Long locationId = ((Number)o).longValue();
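A small helper collecting those conversions (the class and method names here are my own invention, not SDK API):

```java
import java.math.BigInteger;

public class NumberCoerce {
    // All boxed numeric primitives derive from Number, so a cast plus
    // the matching xxxValue() call recovers the expected type.
    static long asLong(Object o) { return ((Number) o).longValue(); }

    static int asInt(Object o) { return ((Number) o).intValue(); }

    static BigInteger asBigInteger(Object o) {
        // longValue() would truncate values wider than 64 bits, so keep a
        // real BigInteger as-is and otherwise parse the string form.
        // Note: the string fallback only works for integral Number types.
        return (o instanceof BigInteger bi) ? bi : new BigInteger(o.toString());
    }
}
```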

BigInteger required its own adapter. If there was a built-in path to get it, I did not discover it.

The 8.3 module update process is not complete, but the serialization seems solid. One concern: for conditions that really should fail, the new serialization paths raise different exception types than before, so we may have to change handlers that catch specific exception types.

The other serialization things we’ve learned:

  1. Passing custom objects requires custom serialization adapters
  2. You can’t overload calls through the serializer. You can keep them overloaded for scripting interfaces, but to make the jump to a remote environment, each method name must be unique.
  3. Don’t use the PyObject serializer, you’ll lose money on that bet.
  4. We implemented a value serializer to pass a single simple value. Now that I know the gson method, I suspect JsonPrimitive would have worked. Might have to go back and try it…
  5. You can pass simple recordset/dataset objects, but you’ll have to create an adapter class, since StreamingDatasets are not supported by the built-in basic dataset serializer. Use something like this and add your own custom handling in the conversion routine:
public DatasetPB toProtobufMessage(Dataset dataset) throws ProtobufSerializationException {
    if (dataset instanceof BasicStreamingDataset bsd) {
        dataset = convertStreamingToBasic(bsd);
    }
    if (dataset instanceof BasicDataset bd) {
        return DatasetPB.newBuilder()
            .setBasicDataset(this.basicDatasetSerializer.toProtobufMessage(bd))
            .build();
    } else {
        throw new ProtobufSerializationException(
            "No serializer available for dataset class '%s'"
                .formatted(dataset.getClass().getName()));
    }
}

This and DatasetPB in your code snippet below make me suspect you're using things from com.inductiveautomation.ignition.common.protocolbuffers. Despite their location, Gateway <-> Designer/Client RPC doesn't use these at all, outside of the tag system doing their own thing as of 8.3.5. Lots of these protobufs won't work without implicit context, i.e. the centralized registry used for gateway network protobuf serialization. RPC uses a fundamentally different approach, so despite both being nominally protobuf there's not a lot of common code. Concrete classes could probably use these common protobufs as their wire format, but would need specific serializers/deserializers for use in RPC.

Note that a lot of this can be customized. The innermost loop of pyToGson (which has an overload to accept a custom GSON instance) looks like this:

      Object object = pyObject.__tojava__(Object.class);
      if (object instanceof JsonElement jsonElement) {
        element = jsonElement;
      } else if (object instanceof Dataset dataset) {
        return datasetToGson(dataset);
      } else {
        Gson gson = Objects.requireNonNullElseGet(customGson, Gson::new);
        element = gson.toJsonTree(object);
      }

So if you customize your GSON instance to use more specific number types, those should be preserved. Though, I suppose it's worth noting that ProtoRpcSerializer will not preserve e.g. short; you'll always end up with at least an int on the other side:

  oneof value {
    bool bool_value = 2;
    sint32 int_value = 3;
    sint64 long_value = 4;
    float float_value = 5;
    double double_value = 6;
    string string_value = 7;
    bytes binary_value = 8;
    ValueCollection collection_value = 9;
  }

Note that if you declare your RPC methods as throwing some custom exception, and that exception is available to your serializer on both sides, it will be faithfully passed through. If you pass something that can't be "rehydrated" on the other side, you'll get a ProtoWrappedException. You can also use the now checked RpcException in place of the older GatewayFunctionException if you were relying on the "message code" based control flow via exception before.
