Websocket Disconnected Because DataFrame/Message Too Large with Perspective


I am receiving this error in the logs -

Websocket connection closed unexpectedly. 
reason=Resulting message size [2102265] is too large for configured max of [2097152], codeMeaning=Message Too Big, 
codeDescription=The endpoint is terminating the connection because a data frame was received that is too large.

I am working with some large dictionary structures and passing them as parameters across different embedded views. I'm optimizing my structures to decrease the size of the data. I have a couple of questions.

Would I be better off using a dataset or a JSON-encoded dictionary to pass parameters instead of a Perspective object? Maybe the better question is: what is the most data-friendly structure to use (lists vs. datasets vs. objects vs. JSON-encoded objects/lists)?

Lastly, where can I increase the max message size, and is it a bad idea to do so?



Datasets are generally better, because Perspective has an optimization for them: it doesn't need to render (and listen for changes on) every inner member.
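Beyond the rendering optimization, a dataset-style layout is also simply smaller on the wire, since column names are sent once instead of being repeated in every row. A rough, plain-Python sketch of the size difference (the payload shape and field names here are made up for illustration):

```python
import json

# Hypothetical payload: 1,000 rows with three fields each.
rows = [{"name": "Tag%d" % i, "value": i * 1.5, "quality": "Good"}
        for i in range(1000)]

# Object/JSON style: every row repeats its key names.
as_objects = json.dumps(rows)

# Dataset style: column names appear once, then plain value rows.
as_dataset = json.dumps({
    "columns": ["name", "value", "quality"],
    "rows": [[r["name"], r["value"], r["quality"]] for r in rows],
})

# The dataset form is noticeably smaller than the object form.
print(len(as_objects), len(as_dataset))
```

The same principle is why a Perspective dataset or a column-oriented JSON structure tends to stay under the websocket frame limit longer than a list of objects does.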

Making properties private also helps a lot, though I don't know if you can do that for view parameters. I would consider different approaches; can you 'batch process' this data entirely on the gateway so you don't have to send so much between views at all?


I've been getting this issue whenever a filter is applied to an alarm journal, unless the date range being viewed is small (a few weeks). I'm afraid to allow the toolbar to be visible in case someone applies an Active filter to what would be a beginning of time view of all historical alarms. I've tried changing the max data frame size, but then it always tries to send something just slightly bigger than the new maximum and disconnects the clients.

I ended up having to go pretty big on that max message size for some of the larger Perspective datasets that have tens or hundreds of thousands of rows.

I forgot to follow up on the issue here. The solution to my particular problem was to not use the native filter, but to instead filter using the filterAlarm extension function, which is a lot faster:

return "Active" == str(alarmEvent.get('EventState'))
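For context, that one-liner is the body of the Alarm Journal Table's filterAlarm extension function, which returns True for rows that should be kept. A minimal sketch of the full function, with a plain dict standing in for the real alarm event object:

```python
# Sketch of the filterAlarm extension function on the Alarm Journal Table.
# In Perspective, alarmEvent is an alarm event object exposing get();
# here a plain dict with an 'EventState' key stands in for it.
def filterAlarm(self, alarmEvent):
    # Return True to keep the row: only show alarms currently Active.
    return "Active" == str(alarmEvent.get('EventState'))

# Example stand-in events:
print(filterAlarm(None, {'EventState': 'Active'}))   # → True
print(filterAlarm(None, {'EventState': 'Cleared'}))  # → False
```

Because the filtering happens per row before the data is assembled for the client, it also keeps the resulting message well under the frame limit.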