ModuleRPCFactory to ProtoBuf

This may have already been answered, but I am having a hard time finding it. We are using ModuleRPCFactory.create in multiple modules to expose Java classes and scripts to our Ignition Vision / Perspective projects.

What is the most direct, built-in way to replace that in 8.3.0-beta3 and moving forward? We have multiple modules that will need this change to support 8.3, and we want to find the most streamlined way to do it.

There's no direct replacement.

For your protobuf requirements, you will need to use the ProtoRpcSerializer builder as described in the migration document.


Thank you and forgive me.

Where is the migration document?

Hmmm. Good question. I downloaded it from Google Docs back in the alpha period. I just skimmed this category and it didn't jump out at me. It was in a pinned topic before the beta was released. I vaguely recall it being moved... Perhaps IA staff can chime in.

(I can't share alpha stuff, sorry.)

I'll reproduce the full 'dev introduction' document here - I wrote the thing, and there's nothing sensitive here. It'll eventually end up in the SDK docs, but no idea when.

This is more general context about the whole system and its technical details than a pure migration guide from 8.1 -> 8.3. The closest thing to a 'tl;dr' is the last ~third of part 2.

Part 1

Context

Ignition has, since its inception, used a relatively straightforward RPC mechanism that uses XML and Java serialization over HTTP.
In 8.3, we’re taking advantage of lessons learned and significantly revamping this system. The new RPC system will be serialization-format agnostic (pushing the choice of technologies down to module authors, with a first-party emphasis on Protobuf and GSON), and will continue to use HTTP as the primary transport mechanism, with the addition of a WebSocket layer for continuous health checks and gateway-pushed events.

Implementation

The roots of the new system should be broadly familiar to anyone used to implementing RPC in modules, but with some significant changes.

Core

RpcCall

public record RpcCall(
    @NotNull String moduleId,
    @NotNull String packageId,
    @NotNull String function
) {}

The most basic unit in the RPC system is com.inductiveautomation.ignition.common.rpc.RpcCall - a simple nominal triple of (moduleId, packageId, function), designed to act as a common locating layer for everything else to build upon. It’s extremely important to note that the packageId is an entirely arbitrary string whose only purpose is to act as a namespace within a particular module - e.g. the Ignition platform might define packages like images, databases, projects, etc. - and it need not have any correlation to an actual Java package name or anything else.
This is possible because of an explicit separation of concerns in the new RPC system - as long as you’re using established channels, Ignition owns the transport layer and the headers, but the entire request/response body is the responsibility of your module. More specifically, an outgoing RPC call from the designer packs the module ID, package ID, and function name into HTTP headers. On the gateway, a common handler unpacks those headers, locates the appropriate module handler, and immediately delegates to it. This allows modules the flexibility to use whatever serialization format they want and ensures that the platform is not being overly prescriptive.

RpcInterface

public @interface RpcInterface {
    @NotNull String packageId();
}

A simple marker annotation to be added to your interfaces in common. The packageId is an arbitrary key that maps to a specific RPC implementation within your module, which matters if you plan to define multiple RPC interfaces. Even if you only have one, the packageId cannot be an empty string and should be something human-readable for logging/diagnostic purposes.
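
For illustration, a minimal common-scope interface might look like the following (the interface name, package ID, and methods are hypothetical):

@RpcInterface(packageId = "example-devices")
public interface ExampleModuleRpc {
    // Plain Java types keep serialization simple on both sides.
    List<String> getDeviceNames();

    void renameDevice(String oldName, String newName);
}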

RpcException

RpcException is a simple exception wrapper defined in common that allows you to pass an int constant from GatewayConstants. This should generally not be necessary, but is available as a backwards compatibility shim to minimize rewriting of exception handling in the client/designer.
Note: RpcException can also be thrown implicitly during initial RPC capability checks; see RpcHandler for more detail.

GatewayRpcSerializer

public interface GatewayRpcSerializer {
    void writeReturnValue(OutputStream stream,
                          RpcCall call,
                          Object value) throws IOException;

    @Nonnull
    List<Object> readParameters(InputStream stream,
                                RpcCall call) throws IOException;
}

This is a shared instance of a class that’s expected to read incoming parameters directly from the request, translate them into a simple list of objects, and perform the corresponding operation in reverse - writing the value returned from an RpcHandler back out as the response. The RpcCall parameter is provided purely for informational purposes, such as for logging, or in case a given serialization scenario requires differentiation. While it’s certainly possible to implement this manually, the recommended pattern is to take advantage of the serializers we’ve already registered for common types - for more info on that, see the Protobuf section below.

ClientRpcSerializer

public interface ClientRpcSerializer {
    void writeParameters(OutputStream stream, RpcCall call, Object... parameters) throws IOException;

    Object readReturnValue(InputStream stream, RpcCall call) throws Exception;
}

The inverse interface to GatewayRpcSerializer - responsible for serializing outgoing parameters on a call, and parsing the return value from the gateway. Both of these interfaces are defined in common scope so that a shared class suitable for RPC exchange can be defined in your module’s common scope, and then separately provided to calls on the gateway and client/designer.

RpcSerializer

RpcSerializer is a simple meta-interface combining GatewayRpcSerializer and ClientRpcSerializer, for situations where you want to declare a single field/parameter/etc. as performing both roles.

Gateway

RpcHandler

@FunctionalInterface
public interface RpcHandler {
    @Nullable
    Object handle(@Nonnull RpcContext context,
                  @Nonnull List<Object> parameters) throws Throwable;

    default boolean isActiveNodeRequired() {
        return true;
    }

    default @Nullable String clientPermissionId() {
        return null;
    }

    default MutabilityMode requiredMutabilityMode() {
        return MutabilityMode.OFF;
    }
}

RpcHandler is the simple functional interface that forms the basis of RPC handling on the gateway. However, for purposes of code navigation and type safety, it will not usually be implemented directly. The list of parameters delivered to your handler is guaranteed to be deserialized by your module’s serializer, and the context object will be provided for you.

Implementations can opt in to various global checks on client state by overriding the default methods.

  • isActiveNodeRequired() must return false to allow the handler to run on backup nodes
  • clientPermissionId() is required to allow the function to run on clients; null means that only designers can run handlers. ClientPermissionsConstants has a new UNRESTRICTED constant that means no permissions are expected of clients, the previous default behavior.
  • requiredMutabilityMode() allows RPC methods to opt in to behavior restrictions based on the client/designer’s “Comm Mode” setting. Most commonly, a particular function could require a READ_WRITE mutability mode, so a client/designer in READ_ONLY mode will not be allowed to invoke the function.

Note: If any of the above checks fail, an RpcException will be thrown and handed to your serializer(s) to write back to the caller. If you are following the recommended proxy based approach, this will throw an UndeclaredThrowableException that contains the RpcException as its cause, unless you have your RPC interface methods throw RpcException directly.
Note: If the handle method returns null, nothing will be written out to the response stream; this allows for streaming responses to write out their response lazily as results are returned. See com.inductiveautomation.ignition.gateway.servlets.gateway.ProtoStreamingDatasetWriter for a possible use case for this.

RpcContext

public interface RpcContext {
    RequestContext request();

    HttpServletResponse response();

    ClientReqSession session();

    /**
     * The name of the project the RPC call is being made from,
     * or null if the caller is not (yet) associated with a project.
     */
    @Nullable String projectName();

    RpcCall rpcCall();
}

RpcContext is a basic interface that provides metadata to a handler, as well as direct access “escape hatches” to the underlying request and response - this is how you could actually write out a streaming response as mentioned above.
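
For instance, a handler could use those escape hatches to stream its output and return null so the framework writes nothing further. This is a loose sketch, not the platform's ProtoStreamingDatasetWriter; produceChunks is a hypothetical helper that yields pre-encoded byte[] chunks:

RpcHandler streamingHandler = (context, parameters) -> {
    HttpServletResponse response = context.response();
    try (OutputStream out = response.getOutputStream()) {
        // produceChunks is hypothetical; encode each chunk however your serializer expects
        for (byte[] chunk : produceChunks(parameters)) {
            out.write(chunk);
        }
    }
    return null; // a null return means nothing further is written to the response stream
};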

GatewayRpcImplementation

public interface GatewayRpcImplementation {
    GatewayRpcSerializer getSerializer();

    RpcRouter getRpcRouter();

    static GatewayRpcImplementation of(GatewayRpcSerializer serializer, Object... interfaces) {
        return new GatewayRpcImplementation() {
            @Override
            public GatewayRpcSerializer getSerializer() {
                return serializer;
            }

            @Override
            public RpcRouter getRpcRouter() {
                return new RpcDelegate(interfaces);
            }
        };
    }

    static GatewayRpcImplementation.Builder newBuilder(GatewayRpcSerializer defaultSerializer) {
        return new Builder(defaultSerializer);
    }
}

On the Gateway, your entrypoint into the new RPC system will be the getRpcImplementation() function on your GatewayHook. This completely replaces the getRPCHandler method. Your module’s RPC implementation must do two things:

  1. Return a serializer that’s capable of
    1. Deserializing incoming parameters from the client/designer
    2. Serializing the outgoing return value to send to the client/designer
  2. Return an RpcRouter that can be used to locate an individual RPC function implementation.

Typically, the easiest path is to use the GatewayRpcImplementation.of() method during module setup to build the implementation that is returned from your module hook.
In rare cases (or for the Ignition platform itself), it makes sense to have dedicated serialization handlers for specific RPC ‘packages’, rather than one overloaded mega-serializer. To that end, newBuilder exists; you register a default serializer, and then add packages to your implementation, optionally providing a more specific serializer for each package.
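
As a rough sketch of the common case, a gateway hook using the simpler of() path might look like this (ExampleModuleRpcImpl is a hypothetical implementation of the interface sketched earlier):

@Override
public Optional<GatewayRpcImplementation> getRpcImplementation() {
    return Optional.of(GatewayRpcImplementation.of(
        ProtoRpcSerializer.DEFAULT_INSTANCE,   // or a module-specific serializer defined in common
        new ExampleModuleRpcImpl(context)      // implements an @RpcInterface-annotated interface
    ));
}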

RpcRouter

public interface RpcRouter {
    @Nonnull
    Optional<RpcHandler> getRpcHandler(RpcCall call);
}

RpcRouter is a simple indirection interface, representing the task of locating a particular RpcHandler for a given RpcCall.
In most cases, it’s not necessary to implement it directly, thanks to helper classes like the following:

RpcDelegate

public class RpcDelegate implements RpcRouter {
    public RpcDelegate(Object delegate) {

RpcDelegate completes the locating tree. When constructing an RpcDelegate, you pass in any number of implementation instances, each of which implements a shared common-scoped Java interface annotated with @RpcInterface(packageId = "some-package"). All methods on those interface(s) are pulled out reflectively and wrapped as RpcHandlers, allowing RpcDelegate to act as an RpcRouter keyed by package ID (from the interface annotation) and function name (the method's declared name).
Note: overloads are not supported - there is no differentiation by types in the RPC system. If you need two similar methods, you must give them distinct names.

RpcDelegate also conveys the RpcContext of the current call via the CURRENT_CONTEXT ThreadLocal, allowing implementations of an otherwise common interface access to the extra features of the RPC system. RpcDelegate also exposes simple static methods to retrieve the individual members of RpcContext directly:

public static RequestContext getRequest()
public static HttpServletResponse getResponse()
public static ClientReqSession session()
public static String projectName()
public static RpcCall rpcCall()
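
For example, an implementation method (continuing the hypothetical ExampleModuleRpc names from earlier) can pull per-call metadata without the shared interface ever referencing RPC machinery:

public class ExampleModuleRpcImpl implements ExampleModuleRpc {
    @Override
    public List<String> getDeviceNames() {
        // Backed by the CURRENT_CONTEXT ThreadLocal that RpcDelegate sets up for this call
        String project = RpcDelegate.projectName();
        ClientReqSession session = RpcDelegate.session();
        // ...use project/session to scope the lookup as needed...
        return List.of();
    }
}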

Finally, RpcDelegate also defines annotations to attach capability restrictions/additional behavior to the generated RpcHandler:

/**
* Annotate a particular method on your RPC <b>implementation</b> with {@code RunsOnBackup} to allow it to be
* invoked on a backup node.
*/
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface RunsOnBackup {}

/**
* Annotate a particular method or your entire RPC <b>implementation</b> with {@code RunsOnClient} to allow it to be
* invoked by a client.
*/
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface RunsOnClient {
	/**
 	* @return A non-null, non-empty string identifying the client permission required to invoke this method.
 	* Use {@link ClientPermissionsConstants#UNRESTRICTED} to allow any client to invoke the method.
 	*
 	* @see ClientPermissionsConstants
 	*/
	String clientPermissionId();
}

/**
* Annotate a particular method on your RPC <b>implementation</b> with {@code RequiredMutabilityMode} to indicate
* that the client must have at least the given {@link MutabilityMode}, or the call will be rejected.
*
* @see RpcHandler#requiredMutabilityMode()
*/
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface RequiredMutabilityMode {
	MutabilityMode value();
}
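
Applied to a hypothetical implementation class (again reusing the ExampleModuleRpc sketch), that looks something like:

public class ExampleModuleRpcImpl implements ExampleModuleRpc {
    @Override
    @RunsOnBackup
    @RunsOnClient(clientPermissionId = ClientPermissionsConstants.UNRESTRICTED)
    public List<String> getDeviceNames() {
        return List.of(); // read-only; allowed on backup nodes and from any client
    }

    @Override
    @RequiredMutabilityMode(MutabilityMode.READ_WRITE)
    public void renameDevice(String oldName, String newName) {
        // rejected unless the caller's Comm Mode allows writes
    }
}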

Client/Designer

GatewayConnection

The same GatewayConnection interface can be obtained and used as in the old system, via GatewayConnectionManager.getInstance(), and this exposes an invoke entrypoint much like the old system. However, there is also a new static method on GatewayConnection used to obtain a proxy instance of an interface. This should be broadly familiar to anyone used to the ModuleRPCFactory class that previously existed:

<T> T getRpcInterface(ClientRpcSerializer serializer,
                      String moduleId,
                      Class<T> rpcInterface,
                      int timeoutMillis); // an overload is also available with a default timeout

This allows any component in your UI to obtain a direct instance of an interface; more on this in Best Practices later.
Note that this ‘shorthand’ method requires that the class you pass in is an interface annotated with @RpcInterface, to obtain a package ID.
Ignition platform RPC proxy instances, for ease of reuse, are available in com.inductiveautomation.ignition.client.rpc.PlatformRpcInstances

Protobuf

Throughout the Ignition platform, we’ve decided to drop Java serialization as much as possible (backwards compatibility constraints excluded). As a result, we’ve settled on a hybrid model for first party RPC serialization that uses Protobuf as the wire format, with support for nested JSON encoding via Gson, since many first party classes already had Gson serializers set up for other work.

RpcMessage

/**
An RpcMessage is always either an actual value, or an error thrown on the gateway; nothing more and nothing less.
 */
message RpcMessage {
  oneof value {
    Value actual = 1;
    Error error = 3;
  }
}

RpcMessage is the overall container of a value coming over RPC - it is essentially an Either<Value, Throwable> Protobuf message.

Value

message Value {
  // An associated identifier that will be used to pick up custom deserialization logic on the receiving side.
  // If no identifier is supplied, the value is decoded as its underlying Java type.
  optional string identifier = 1;

  oneof value {
    bool bool_value = 2;
    sint32 int_value = 3;
    sint64 long_value = 4;
    float float_value = 5;
    double double_value = 6;
    string string_value = 7;
    bytes binary_value = 8;
    ValueCollection collection_value = 9;
  }

  message ValueCollection {
    repeated Value value = 1;

    optional Implementation implementation = 2;

    enum Implementation {
      LIST = 0;
      SET = 1;
      MAP = 2;
      BOOL_ARRAY = 3;
      INT_ARRAY = 4;
      LONG_ARRAY = 5;
      FLOAT_ARRAY = 6;
      DOUBLE_ARRAY = 7;
      BOXED_ARRAY = 8;
    }
  }
}

Value is the atomic unit of serialization. A Value is either a basic Protobuf primitive, raw binary bytes, or a sequence of (recursive) Value objects. Each Value object has an optional associated identifier; if supplied, this will be used to look up a special deserialization handler on the receiving side. In this way, a class can be ‘decomposed’ down to a primitive type (e.g. a java.util.Date can be sent as long_value, but decoded as a Date on the receiving side).

binary_value is intended as an “escape hatch” of sorts, or a means to gain greater flexibility. It can be used in a scenario where the platform or your module’s overall Protobuf support is adequate for all but a few special cases which might need their own special serialization handling. You can serialize and deserialize your object using any arbitrary strategy, and only have to provide a stable identifier that can be passed to the other side of the RPC connection and understood.
See also ObjectSerializers for a Java-serialization oriented path.

ValueCollection, meanwhile, is a repeated sequence of Values, allowing recursive self-definition, along with an Implementation enum hint. This allows complex heterogeneous structures like JsonObject/JsonArray to be sent, as well as standard primitive arrays and collection types.


Hit the forum post character limit, so, continued:

Part 2

ProtoRpcSerializer

ProtoRpcSerializer, the entrypoint to this new serialization model, is a base class defined in common scope, explicitly designed to be extended by sub-implementations. It has a newBuilder method that returns a new ProtoSerializerBuilder instance, and it implements RpcSerializer.
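
In the simplest case, construction might look like this (module-specific adapter registrations, covered below, would be chained in before build(); storing the result as a shared constant in common scope is just one option):

// A minimal sketch; typically kept in common scope (e.g. as a static field on the RPC interface)
// so the same instance is used on both the gateway and client/designer sides.
public static final RpcSerializer SERIALIZER = ProtoRpcSerializer.newBuilder()
    // addAdapter / addGsonAdapter / addBinaryAdapter calls for your module's types go here
    .build();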

Push Notifications

If you have an existing instance of a ProtoRpcSerializer, it has two conversion methods (asPushNotificationSerializer, asPushNotificationDeserializer) that adapt the built in Protobuf handling to work for Push Notifications as well.

ProtoSerializerBuilder

This builder automatically registers serialization handlers for core Java primitives, common value types like java.util.Locale, and some core Ignition classes like BasicQualifiedValue. In addition, various methods are available to add custom serialization behavior:

Binary Adapters

/**
 * Registers an adapter capable of serializing and deserializing objects of the given class.
 * This is the most flexible serialization method, as it allows for custom serialization and deserialization logic -
 * you only need to provide your own logic to marshal to and from bytes.
 */
public <T> ProtoSerializerBuilder addBinaryAdapter(Class<T> clazz,
                                                   BinarySerializer<T> serializer,
                                                   BinaryDeserializer<T> deserializer) {}

@FunctionalInterface
public interface BinarySerializer<T> {
    byte[] encode(T any, SerializationContext context) throws ProtoSerializationException;
}

@FunctionalInterface
public interface BinaryDeserializer<T> {
    T decode(byte[] serialized, DeserializationContext context) throws ProtoSerializationException;
}
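
For example, on a ProtoSerializerBuilder (builder below), registering a hypothetical DeviceSnapshot class that already has its own to/from-bytes helpers might look like:

builder.addBinaryAdapter(
    DeviceSnapshot.class,
    (snapshot, ctx) -> snapshot.toBytes(),          // marshal to bytes however you like
    (bytes, ctx) -> DeviceSnapshot.fromBytes(bytes) // and back again on the receiving side
);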

Proto Adapters

A subcase of binary adapters, intended as a slightly easier way to recycle existing Protobuf adapters inside the standard RPC payload envelope.

public <T, S extends Message> ProtoSerializerBuilder addProtoAdapter(
    Class<T> clazz,
    FragileFunction<byte[], S, InvalidProtocolBufferException> protoDeserializer,
    FragileFunction<T, S, Exception> serializer,
    FragileFunction<S, T, Exception> deserializer) {
}
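
For example, assuming you already have a generated Protobuf message and conversion helpers (DeviceSnapshotProto, toProto, and fromProto are all hypothetical), the existing adapter can be recycled directly:

builder.addProtoAdapter(
    DeviceSnapshot.class,
    DeviceSnapshotProto::parseFrom,            // bytes -> generated Protobuf message
    DeviceSnapshot::toProto,                   // your class -> generated message
    proto -> DeviceSnapshot.fromProto(proto)   // generated message -> your class
);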

GSON Adapters

You can also pass Gson adapters/serializers directly in to use them within the overall Proto envelope.

/**
 * Registers an adapter capable of serializing and deserializing objects of the given class.
 * This method will use the internal Gson instance to serialize and deserialize objects.
 */
public <T, S extends JsonSerializer<T> & JsonDeserializer<T>> ProtoSerializerBuilder addGsonAdapter(Class<T> clazz,
                                                                                                    S adapter) {

This allows you to “recycle” any existing GSON serializers you may already have, without having to write Protobuf alternatives.
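
For instance, if your module already has a Gson adapter class implementing both JsonSerializer and JsonDeserializer (DeviceSnapshotGsonAdapter is hypothetical), registration is a one-liner:

builder.addGsonAdapter(DeviceSnapshot.class, new DeviceSnapshotGsonAdapter());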

Value Adapters

Value adapters are the lowest level primitive serializer; essentially all other types of adapters are just specialized forms of value serializer and deserializer. This allows you to wrap any arbitrarily complex object, as long as it can be encoded and decoded to primitives, much like Json/Gson adapters.

/**
 * Registers an adapter capable of serializing and deserializing objects of the given class.
 * "Primitive" serialization stores objects as a {@link Value}; akin to JsonElement but with strong types.
 */
public <T> ProtoSerializerBuilder addAdapter(Class<T> clazz,
                                             ValueSerializer<T> serializer,
                                             ValueDeserializer<T> deserializer) {}

@FunctionalInterface
public interface ValueSerializer<T> {
    Value encode(T any, SerializationContext serializationContext) throws ProtoSerializationException;
}

@FunctionalInterface
public interface ValueDeserializer<T> {
    T decode(Value serialized, DeserializationContext context) throws ProtoSerializationException;
}
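
As a sketch, assuming the generated Value builder follows standard protoc naming (setLongValue/getLongValue for the long_value field), a java.time.Duration could be decomposed to a long:

builder.addAdapter(
    java.time.Duration.class,
    (duration, ctx) -> Value.newBuilder().setLongValue(duration.toMillis()).build(),
    (value, ctx) -> java.time.Duration.ofMillis(value.getLongValue())
);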

Fallback Customization

The builder also exposes the ability to customize ‘fallback’ behavior, if you wish to avoid Gson’s implicit use of reflection to serialize and deserialize classes as a ‘last resort’. Overriding this is generally not recommended - the reflection fallback is very convenient for e.g. record classes - but it is available as an option.

/**
 * By default, the constructed serializer will fall back to a GSON reflection based approach to serialize values
 * without more specific adapters registered. If you want to use a different approach, you can do so by providing
 * serialization and deserialization methods appropriately.
 */
@SuppressWarnings("unchecked")
public ProtoSerializerBuilder withFallbackSerializationStrategy(ValueAdapter<?> adapter) {
    fallbackSerializer = (ValueSerializer<Object>) adapter;
    fallbackDeserializer = (ValueDeserializer<Object>) adapter;
    return this;
}

ObjectSerializers

ObjectSerializers is an “escape hatch” - throughout Ignition, there are areas where, for backwards compatibility, Java serialization is essentially unavoidable. While we will continue to tighten down these requirements over time, the ObjectSerializers utility exists as a bridge to allow the minimum possible surface area to be Java serialized as easily (and safely) as possible within a larger Protobuf envelope.
The typical entrypoint will be one of the two static methods on the class:

/**
 * Creates a new {@link SaferObjectProtoSerializer} for the given class.
 * This will use the {@link SaferObjectInputStream} and seed it with the default values along with the target class
 * and any additionally provided classes. <b>All other objects will cause deserialization failure</b>
 */
public static <T> SaferObjectProtoSerializer<T> forSafeObject(Class<T> targetClass, Class<?>... safeClasses) {
    return new SaferObjectProtoSerializer<>(targetClass, safeClasses);
}

/**
 * <b>DO NOT USE UNLESS LAST RESORT!</b><br> This uses default java serialization. the only time this should ever
 * be done is if we have basically no way to know what the object is or is going to be so we have no other choice.
 * <br><br>NOTE: This serialization  mechanism still uses the {@code ObjectInputFilter} which helps prevent
 * typical deserialization attacks.
 */
public static <T> UnsafeObjectProtoSerializer<T> forUnsafeObject(Class<T> clazz) {
    return new UnsafeObjectProtoSerializer<>(clazz);
}

Though, if absolutely necessary, nested classes are exposed that can be subclassed to provide further customization.

A Note On Nesting

The first-party RPC implementation is intentionally pragmatic, preferring as much as possible to avoid rewriting code and maintaining duplicate canonical serialization formats for first-party classes. As part of this rework, it’s possible to end up with complicated scenarios worth diving into. For instance, the standard ProtoRpcSerializer construction uses com.inductiveautomation.ignition.common.rpc.proto.json.BasicQualifiedValueGsonAdapter to serialize BQVs.
In this case, the serialization of values will continue to be Java serialization for backwards compatibility, but it will be wrapped in a GSON adapter to be available for contextual serialization in other adapters.
Thus, the actual serialization process for a given BQV goes as follows:

  1. The ‘value’ inside the BQV is Java serialized into a byte[].
  2. That byte[] is encoded by the GSON context into a base64 string (since JSON has no support for binary data)
  3. The base64-encoded string is embedded into a JSON object as the value under a “value” key.
  4. ProtoRpcSerializer adapts that JSON object into a ValueCollection so that it can be Protobuf encoded.
  5. The Protobuf message is written out over the HTTP connection as binary bytes.

This nesting is unfortunately convoluted, but keeps the developer experience clean and achieves satisfactory performance and payload size over the wire. The deserialization process is essentially the exact inverse, reversing the steps to achieve a plain Java object inside a BQV at the other end.

Push Notifications

In broad strokes, the Push Notification API (used to send messages from the gateway down to clients/designers on demand) appears about the same to API consumers, with one significant visible exception: You must provide serialization and deserialization functions; push notifications are no longer implicitly Java serialized (you are, of course, welcome to Java serialize them yourself if you so choose).
The backing technology has changed; the client/designer now maintains a persistent websocket connection back to the gateway (independent of any Perspective session websocket), and your push notifications will be sent as individual messages over that websocket. Any payload you specify will be sent as opaque bytes (encoded via your provided serializer) and your PushNotificationListener on the receiving side must register a deserialization function that can turn those bytes back into a useful Java object for your listener to consume.

ClientReqSession.addPushNotification

The basic entry point to sending a push notification (see also GatewaySessionManager.sendNotification). The signature is the same as in prior versions of Ignition, with the addition of a trailing serializer parameter that accepts your type T and writes it to a stream. If you have no message to send, and the notification itself is the only signal, you can call the reduced-args overload, which does not require a serializer.

PushNotificationSerializer/PushNotificationDeserializer

Much like the general RPC ClientRpcSerializer/GatewayRpcSerializer, this pair of mirrored interfaces represents the act of encoding some arbitrary message payload into binary, and the inverse operation on the client side upon receipt.
Both classes have helpful static utility methods that send payloads as UTF-8 encoded strings, in case you already have a string based serialization format available (such as GSON).

PushNotificationListener

On the receiving side, the PushNotificationListener interface changed significantly. It is no longer possible to register a single listener for multiple message types - each listener is strictly tied to a single (moduleId, messageType) pair.

public interface PushNotificationListener<T> {
    /**
     * The module id this instance should be registered to.
     */
    String moduleId();

    /**
     * The message type this instance should be registered to.
     */
    String messageType();

    /**
     * A callback to be invoked when a notification is received.
     * Will be run on a queue owned by the local scope's {@link GatewayConnection} instance.
     */
    void receiveNotification(T notification);
}

You are of course able to implement this interface yourself, but the vast majority of callers can switch to using the new static factory method on the interface:

    /**
     * Creates a new PushNotificationListener that invokes the provided listener on the {@link EventQueue EDT}.
     *
     * @param moduleId The module id to listen to.
     * @param messageType The message type to listen to.
     * @param listener The listener to invoke when a notification is received.
     */
    static <U> PushNotificationListener<U> of(
        String moduleId,
        String messageType,
        Consumer<U> listener
    );

This allows you to pass in a simple handling lambda that accepts your custom notification type. Users interested in further customization may also consider subclassing AbstractPushNotificationListener instead of implementing the interface directly.

GatewayConnection.addPushNotificationListener

In any event, the API to add push notification listeners is approximately the same, mirroring the broadcasting functions on the session:

    /**
     * Adds a handler for push notifications.
     * The provided deserializer is used to convert incoming raw binary bytes into your handler's expected message type.
     */
    <T> void addPushNotificationListener(PushNotificationListener<T> listener,
                                         PushNotificationDeserializer<T> deserializer);

    /**
     * Adds a handler for empty push notifications, where the message itself is the only signal.
     */
    default void addPushNotificationListener(PushNotificationListener<Void> listener) {
        addPushNotificationListener(listener, null);
    }

Obtain an instance of GatewayConnection appropriate to your current execution scope using the standard singleton getter:
GatewayConnectionManager.getInstance().addPushNotificationListener(listener, deserializer)
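
Putting those pieces together, a registration might look like this (the module ID, message type, and UI handler are hypothetical, and the construction of the deserializer is omitted since its exact shape depends on how you encode the payload):

PushNotificationListener<String> listener = PushNotificationListener.of(
    "example-module",                        // hypothetical module ID
    "device-changed",                        // hypothetical message type
    payload -> refreshDeviceList(payload)    // hypothetical UI update; of() dispatches on the EDT
);
// stringDeserializer is a PushNotificationDeserializer<String>, e.g. built from the
// UTF-8 string helpers mentioned above.
GatewayConnectionManager.getInstance().addPushNotificationListener(listener, stringDeserializer);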

Basic Migration Workflow

Modules

  1. Create one or more Java interfaces defining your RPC methods in a common scope of your module code (so that it’s reachable from the gateway and client/designer). Annotate this interface with @RpcInterface.
  2. Create a concrete implementation of this interface in the gateway scope of your module.
    1. Note: There are some static accessors on RpcDelegate to provide access to members of RpcContext within your function implementations.
  3. Override getRpcImplementation() in your module’s gateway hook to return a custom implementation, which should:
    1. Return a module-specific GatewayRpcSerializer (more on that later).
    2. Return an RpcRouter (often this is just an AggregateRpcRouter).
    3. Register a package name (it can be whatever you want for this RPC within your module) and a new instance of RpcDelegate wrapping the implementation class instance from step 2.
  4. Your DesignerHook/ClientModuleHook should obtain an instance of your RPC interface to be used for RPC calls to the gateway. The following example instantiates an RPC proxy on the designer side for the PerspectiveModuleRpc Java interface:
PerspectiveModuleRpc rpc = GatewayConnectionManager.getInstance().getRpcInterface(
    ProtoRpcSerializer.DEFAULT_INSTANCE,    // generic serializer
    PerspectiveModule.MODULE_ID,            // the module, for locating
    PerspectiveModuleRpc.class,             // the RPC Java interface
    120_000                                 // timeout (ms)
);
  Note: this uses the default instance of ProtoRpcSerializer, but it could be any serializer. If you have common objects in your module that you pass back and forth, define a serializer in your module's common scope, or as a static field on the RPC interface, so it can be used on the gateway as well as in the client/designer scope.
  5. Replace all usages of ModuleRPCFactory.create with get() calls to your RPC proxy instance.
  6. Audit the RPC interface to make sure all of the object types are serializable via GSON or Protobuf and are registered with the serializer you instantiate. If they are platform-level objects, add them to the ProtoSerializerBuilder; if they are module-specific, add them to your own subclasses of the serializers implementing GatewayRpcSerializer and ClientRpcSerializer.

Platform

Essentially the same as modules, but register your additional RPC implementations in IgnitionGateway#createRpcImplementation.

Best Practices

  1. Define shared interface(s) in common. This means you can’t implicitly rely on a GatewayContext and can make your RPC interfaces “clean”.
    1. Be explicitly aware of the types you are sending over RPC. Whenever possible, reduce communication to ‘primitives’ or Java platform classes, for easier handling on both sides.
  2. Use these interfaces directly, via the proxying mechanism and indirection. Only directly implement RpcHandler and RpcRouter if you absolutely have to.
  3. Minimize construction of RPC proxy instances (GatewayConnection.getRpcInterface) - ideally, obtain a single instance locally in whatever GUI class needs them, or store them as fields in your module’s local hook, etc., because the proxy instantiation overhead is not trivial. These instances can be safely retrieved and cached, preferably as static fields. The internal proxy invocation handler always retrieves the current gateway connection, so there’s no risk of accidentally invoking a method on an old/stale connection.
  4. If you’re migrating a scripting function implementation that uses PyObjects directly, sending or receiving, strongly prefer migrating to something else. Use plain Java classes, ideally “primitives”, and only “hydrate” into a Python object in the local scope when necessary.
  5. In general, avoid sending datasets directly over RPC. Datasets are fundamentally awkward to work with from Java due to their total lack of compile time safety. If you need to return a dataset for backwards compatibility reasons (say, in a scripting function implementation), consider returning a strongly typed list of record objects from your RPC interface, and “adapting” that to a dataset in the local scope. Look at com.inductiveautomation.ignition.common.rpc.impl.SwingRpc.SwingSessionInfo for an example of this pattern.
  6. RPC interface calls are going to block whatever thread you use to call them. Standard Swing programming advice (use a SwingWorker or other mechanism to move off the EDT) applies; see the sketch after this list.
    1. Consider using the new Task class (com.inductiveautomation.ignition.client.util.gui.progress.Task), especially if you need an easier way to bridge an older call that implicitly blocks the GUI.
  7. If you choose to opt in to the mutability mode capability restrictions, consider using the new static methods on com.inductiveautomation.ignition.client.util.gui.ReadWriteOptionDialog to encapsulate required elevation operations via a pattern end users will be familiar with.
  8. GSON vs Protobuf - Which do I use?
    1. The answer here is: it depends - but really, it's whatever is the easiest migration path. If a GSON serializer already exists for that class, use it. If it doesn’t, and the only conceivable serialization case is for RPC, then you can either reach toward Protobuf or provide a custom RPC-specialized adapter (such as a ValueAdapter) to encode the payload efficiently without adding extra code generation. If it's a base Java class, in general prefer Protobuf. If it's a third-party class, you should be wrapping it in a delegate (probably a record) anyway, and then decide.
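
For example (DEVICE_RPC and the list-model update below are hypothetical), the standard SwingWorker pattern keeps the blocking proxy call off the EDT:

new SwingWorker<List<String>, Void>() {
    @Override
    protected List<String> doInBackground() {
        return DEVICE_RPC.getDeviceNames();     // blocking RPC proxy call, kept off the EDT
    }

    @Override
    protected void done() {
        try {
            deviceListModel.setDevices(get());  // back on the EDT; safe to touch Swing state
        } catch (Exception e) {
            // unwrap/report the failure (e.g. an UndeclaredThrowableException wrapping RpcException)
        }
    }
}.execute();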

I was using several RPC interfaces in 8.1 with:

GatewayHook

@Override
public Object getRPCHandler(ClientReqSession session, String projectName){
    return new GatewayRPCHandler(this.context,...);
}

GatewayRPCHandler

public class GatewayRPCHandler implements interface1RPC,interface2RPC{
}

Is it possible in 8.3 with:

GatewayHook

@Override
public Optional<GatewayRpcImplementation> getRpcImplementation() {
    return Optional.of(GatewayRpcImplementation.of(
            interface1RPC.SERIALIZER,interface2RPC.SERIALIZER,
            new GatewayRPCHandler(this.context))
    );
}

GatewayRPCHandler

public class GatewayRPCHandler implements interface1RPC, interface2RPC {
}

I’m not sure it’s the right use of GatewayRpcImplementation.of.

If you actually need two distinct serializers for your different RPC interfaces, then use com.inductiveautomation.ignition.gateway.rpc.GatewayRpcImplementation#newBuilder to create a new Builder instance which allows you to fluently add interfaces with their own serializers attached:

Note that this is just some simple plumbing around, essentially, a Map<String, Handler> that does the routing by package ID. Nothing's stopping you from coming up with your own routing/handling strategy; the factory methods are just for convenience. See also com.inductiveautomation.ignition.gateway.rpc.CompositeSerializer and its associated builder.

You're able to do this (provide a single implementation class that implements multiple RPC interfaces) in the new RPC, but I personally find it unnecessary.

If you truly need multiple RPC implementations and independent serializers for each, and you want to take advantage of our first party niceties as much as possible, it would be something like this:

Common

@RpcInterface(packageId = "interface-1") // the package IDs here are arbitrary examples
interface RpcInterface1 {
	// methods
}

@RpcInterface(packageId = "interface-2")
interface RpcInterface2 {
	// methods

	static RpcSerializer SERIALIZER = ProtoRpcSerializer.newBuilder().build();
}

Gateway

class RpcInterface1Impl implements RpcInterface1 {
	public RpcInterface1Impl(GatewayContext context);

	// methods
}

class RpcInterface2Impl implements RpcInterface2 {
	public RpcInterface2Impl();

	// methods
}

class GatewayHook {
	Optional<GatewayRpcImplementation> getRpcImplementation() {
		return Optional.of(
			GatewayRpcImplementation.newBuilder(ProtoRpcSerializer.DEFAULT_INSTANCE)
				.addInterface(new RpcInterface1Impl(context))
				.addInterface(new RpcInterface2Impl(), RpcInterface2.SERIALIZER)
				.build()
		);
	}
}

Designer/Client


class DesignerHook {
	public static final RpcInterface1 INTERFACE_1 = GatewayConnection.getRpcInterface(
		ProtoRpcSerializer.DEFAULT_INSTANCE,
		MY_MODULE_ID,
		RpcInterface1.class,
		60000 // timeout
	);

	public static final RpcInterface2 INTERFACE_2 = GatewayConnection.getRpcInterface(
		RpcInterface2.SERIALIZER,
		MY_MODULE_ID,
		RpcInterface2.class,
		60000 // timeout
	);
}