Dataset parallel writing collision

Is there a way to update a dataset without rewriting it?

I have a dataset as a memory tag, and I have two sources updating it (tag change scripts). I want to update a specific cell in the dataset, but if the two sources write to the dataset at the same moment, the data from one of them is lost. The reason is that system.dataset.setValue does not update the dataset in place; it returns a new dataset with the updated cell, and the script then uses that new dataset to overwrite the source dataset. So when two sources modify the same dataset concurrently, data can be lost. I tested this scenario and the issue is present. Hope I explained it well.
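The lost-update cycle described above can be sketched in plain Python. The `set_value` helper below is a hypothetical stand-in for Ignition's `system.dataset.setValue` (copy-on-write, original untouched); the interleaving is written out explicitly rather than using threads, so the lost write is deterministic:

```python
# Plain-Python sketch of the lost-update problem, assuming a
# copy-on-write update function like system.dataset.setValue.

def set_value(dataset, row, col, value):
    """Hypothetical analog of system.dataset.setValue: returns a NEW dataset."""
    new = [list(r) for r in dataset]  # full copy; original is untouched
    new[row][col] = value
    return new

# Shared "tag" holding a 1x2 dataset.
tag = [["a", "b"]]

# Both writers read the tag BEFORE either writes it back.
snapshot_1 = tag                              # writer 1 reads
snapshot_2 = tag                              # writer 2 reads

result_1 = set_value(snapshot_1, 0, 0, "X")   # writer 1 updates cell (0, 0)
result_2 = set_value(snapshot_2, 0, 1, "Y")   # writer 2 updates cell (0, 1)

tag = result_1                                # writer 1 writes back
tag = result_2                                # writer 2 writes back, clobbering writer 1

print(tag)  # [['a', 'Y']] -- writer 1's "X" is lost
```

Writer 2's snapshot never saw writer 1's change, so writing its copy back silently discards it; that is the collision the question describes.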
So does anybody have an idea how to modify a dataset without rewriting the whole thing?

Datasets are immutable, so no. Consider either moving this to the database, or combining your scripts so that you are writing to the tag once.

Instead of writing directly to the dataset, have each possible writer call a gateway message handler, where the actual dataset update is done. Individual handlers are serialized; no parallel writes will happen.
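In Ignition this would be done with `system.util.sendMessage` and a gateway message handler, which cannot run standalone; the sketch below imitates the serialization with a queue and a single worker thread, so every update is applied one at a time and nothing is clobbered (`set_value` is again a hypothetical copy-on-write analog):

```python
# Sketch of the "single serialized writer" idea in plain Python:
# writers enqueue update requests; one worker (playing the role of the
# gateway message handler) applies them sequentially.
import queue
import threading

def set_value(dataset, row, col, value):
    """Hypothetical copy-on-write analog of system.dataset.setValue."""
    new = [list(r) for r in dataset]
    new[row][col] = value
    return new

updates = queue.Queue()
tag = [["a", "b"]]

def handler():
    """Applies queued updates one by one; no parallel writes possible."""
    global tag
    while True:
        item = updates.get()
        if item is None:          # shutdown sentinel
            break
        row, col, value = item
        tag = set_value(tag, row, col, value)

worker = threading.Thread(target=handler)
worker.start()

# Two "sources" send their updates instead of writing the tag directly.
updates.put((0, 0, "X"))
updates.put((0, 1, "Y"))

updates.put(None)
worker.join()
print(tag)  # [['X', 'Y']] -- both updates survive
```

Because only the worker ever reads and writes the tag, the read-modify-write cycle can no longer interleave between two writers.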

Thanks for the reply. I understand that a possible solution is to have only one script updating/writing to the dataset, but that's not what I'm looking for. I replaced the dataset with a string array; it works better.

Have a nice day!

While there’s nothing in place to stop you from mutating a String array, it’s not really supported and may have unintended consequences, like tag change scripts not seeing the mutation.

Using String arrays correctly, you would suffer the same problem as with a Dataset.

True, but it depends on how you use it. For my goals everything looks good.
I have one source which adds data to empty fields (cells) of the array.
And another script which deletes the data after the required operations on it.

So I have two script units working with the same String array, but they always work with different array cells. And I can update a single cell, instead of rewriting the whole array as in the dataset case.
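The disjoint-cells scheme can be sketched in plain Python with a shared mutable list: one writer only fills empty slots, the other only clears slots it was handed, so they never touch the same index. (As noted above, in Ignition such in-place mutation may not be visible to tag change scripts; this is only an illustration of the idea, and the helper names are made up.)

```python
# Sketch of the disjoint-cells approach: two writers share one mutable
# array but, by convention, never operate on the same cell.

EMPTY = None
shared = [EMPTY] * 4   # the "String array" tag

def add_data(array, value):
    """Writer 1: put value into the first empty cell, mutating in place."""
    for i, cell in enumerate(array):
        if cell is EMPTY:
            array[i] = value
            return i
    raise RuntimeError("array full")

def clear_data(array, index):
    """Writer 2: clear one cell after processing it, mutating in place."""
    array[index] = EMPTY

i = add_data(shared, "job-1")   # fills cell 0
j = add_data(shared, "job-2")   # fills cell 1
clear_data(shared, i)           # clears cell 0 only

print(shared)  # [None, 'job-2', None, None]
```

This avoids the read-copy-write cycle entirely, which is why the two scripts stop losing each other's data; the safety rests on the convention that the two writers never claim the same index.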