I'm thinking about using Redis to exchange data, but importing the redis package into Ignition's pylib throws an error at runtime.
The exchange could involve a large amount of data at millisecond rates, with multiple clients accessing it.
I could use OPC UA, but that doesn't feel like a clean fit.
Is there another NoSQL database I could use?
I tried Jedis and redis-py; neither works in Ignition.
I spent a few hours on the Redis route and still couldn't get it working from the Ignition side.
Then I moved on to setting up a Sparkplug MQTT client to talk to the Ignition gateway. Pub/sub on a few tags is fine, but the amount of work grows tremendously once I want to share hundreds of tags, including JSON documents and UDT instances, between Ignition and external Python files.
So, what would be the easiest and cleanest way to exchange a few hundred values between Ignition and an external Python file?
Any suggestion is welcome.
WebDev. Make an API that Python can call.
Or maybe an API in Python that Ignition can call.
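To make the first suggestion concrete, here is a minimal sketch of an external Python script calling a hypothetical Ignition WebDev endpoint. The URL, project name, and payload shape are all illustrative assumptions, not a real project's API; only the standard library is used.

```python
# Sketch: POST tag values to a hypothetical Ignition WebDev endpoint.
# GATEWAY_URL and the {tagPath: value} payload shape are assumptions.
import json
import urllib.request

GATEWAY_URL = "http://localhost:8088/system/webdev/myproject/tags"  # hypothetical

def encode_tags(tags: dict) -> bytes:
    """Serialize a {tagPath: value} dict to a JSON request body."""
    return json.dumps(tags).encode("utf-8")

def post_tags(tags: dict) -> dict:
    """POST tag values to the gateway and return its JSON response."""
    req = urllib.request.Request(
        GATEWAY_URL,
        data=encode_tags(tags),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # Requires a WebDev resource on the gateway that accepts this payload.
    post_tags({"[default]Line1/Speed": 42.5})
```

On the gateway side, the matching WebDev Python resource would read `request['data']` and write the values with `system.tag.writeBlocking`.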
I was using the WebDev API mainly for front-end React communication.
I'll give calling the API from a Python script a try.
Meanwhile I'm trying the new MongoDB module, but I still can't get a local MongoDB instance configured correctly on the Connectors page.
Just to update:
MongoDB is working fine so far for exchanging data between Ignition and the Python files running in Docker.
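For anyone trying the same pattern, here is a hedged sketch of the external side: Ignition writes tag snapshots into a collection (via the MongoDB module or a gateway script), and the Python process reads and writes the same documents with pymongo. The database, collection, and document shape below are assumptions, not what the MongoDB module mandates.

```python
# Sketch of a MongoDB tag hand-off. Document shape, db name, and collection
# name are illustrative assumptions agreed on by both sides.
from datetime import datetime, timezone

def tag_document(tag_path: str, value) -> dict:
    """Shape one tag sample as a document both sides agree on."""
    return {
        "tagPath": tag_path,
        "value": value,
        "ts": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    # Requires `pip install pymongo` and a reachable MongoDB instance.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    coll = client["ignition"]["tag_exchange"]  # hypothetical db/collection
    # Keep one "current value" document per tag path.
    coll.replace_one(
        {"tagPath": "[default]Line1/Speed"},
        tag_document("[default]Line1/Speed", 42.5),
        upsert=True,
    )
    print(coll.find_one({"tagPath": "[default]Line1/Speed"}))
```

Upserting one document per tag path keeps the collection bounded; append-only inserts with a timestamp index would suit history instead.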
I chatted with an IT engineer who works on real-time operating system design. He said that no matter how fancy Redis/MongoDB/other databases are, local binary file read/write is always the easiest and fastest way to exchange data between applications on the same PC.
Does anyone have a different opinion on that?
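The pattern he described can be sketched with only the standard library: pack values into a fixed binary layout, write to a temp file, then rename atomically so the reader never sees a half-written file. The layout here (100 little-endian doubles) is an illustrative assumption.

```python
# Sketch of plain binary-file exchange between two local processes.
# FMT (100 little-endian doubles) is an assumed layout both sides share.
import os
import struct
import tempfile

N_VALUES = 100
FMT = "<%dd" % N_VALUES  # little-endian doubles

def write_values(path: str, values) -> None:
    """Write all values, then atomically swap the file into place."""
    data = struct.pack(FMT, *values)
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "wb") as f:
        f.write(data)
    os.replace(tmp, path)  # atomic on the same filesystem

def read_values(path: str):
    """Read the full fixed-size record back."""
    with open(path, "rb") as f:
        return list(struct.unpack(FMT, f.read()))
```

The atomic `os.replace` is what makes the naive approach safe with concurrent readers; without it, a reader can catch a partially written file.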
Cheers
I would say shared memory regions with lockless ring buffers. Basically the technology used for virtual machine host-guest virtio drivers.
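To illustrate the idea for the Python side, here is a toy single-producer/single-consumer ring buffer over a shared memory region (`multiprocessing.shared_memory`, Python 3.8+). Head and tail indices live in the first 16 bytes; each slot holds one 8-byte double. This only shows the layout; a real lockless design needs atomic operations and memory barriers that plain Python does not give you.

```python
# Toy SPSC ring buffer in shared memory. Layout: 16-byte header (head, tail
# as unsigned 64-bit ints) followed by SLOTS slots of one double each.
# Illustrative only: no atomics/barriers, so not truly lockless across CPUs.
import struct
from multiprocessing import shared_memory

SLOTS = 8   # usable capacity is SLOTS - 1 (one slot kept empty)
HDR = 16    # two 8-byte unsigned ints: head, tail
SLOT = 8    # one double per slot

class Ring:
    def __init__(self, name=None, create=False):
        size = HDR + SLOTS * SLOT
        self.shm = shared_memory.SharedMemory(name=name, create=create, size=size)
        if create:
            self.shm.buf[:HDR] = struct.pack("<QQ", 0, 0)

    def _idx(self):
        return struct.unpack("<QQ", bytes(self.shm.buf[:HDR]))

    def push(self, value: float) -> bool:
        """Producer side: write a value, then publish the new tail."""
        head, tail = self._idx()
        nxt = (tail + 1) % SLOTS
        if nxt == head:  # full
            return False
        off = HDR + tail * SLOT
        self.shm.buf[off:off + SLOT] = struct.pack("<d", value)
        self.shm.buf[8:16] = struct.pack("<Q", nxt)
        return True

    def pop(self):
        """Consumer side: read a value, then advance the head."""
        head, tail = self._idx()
        if head == tail:  # empty
            return None
        off = HDR + head * SLOT
        (value,) = struct.unpack("<d", bytes(self.shm.buf[off:off + SLOT]))
        self.shm.buf[0:8] = struct.pack("<Q", (head + 1) % SLOTS)
        return value

    def close(self, unlink=False):
        self.shm.close()
        if unlink:
            self.shm.unlink()
```

A second process would attach with `Ring(name=..., create=False)` using the name from `ring.shm.name`. Note this still would not run inside Ignition's Jython; it sketches the external-process side of the idea.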
Thanks for sharing your experience.
Ring buffers are new to me; I'll need a bit of time to research them.