Integration Toolkit Solutions Wiki

By popular request, I've created this topic to be used in wiki mode for collaborative generation of solutions to particular problems. To keep this neat, please follow these guidelines:

  • New problems seeking solutions should be posted as replies, and the discussion of that problem kept within that comment, wiki-style. A one-line summary should be added to this topic starter, with an embedded link (not on its own line). After posting, be sure to convert the comment to operate in wiki mode.

  • Use the "Hide Details" tool to keep the one comment neat. In particular, code blocks belong inside the details.

  • Create a "uses" details section at the end of a problem/solution comment where you deep link into the public module documentation for each function or feature used in the solution. Consider also deep linking into the IA manual, too.

Public Integration Toolkit Overview

Ignition Native Scripting Functions, By Scope

Ignition Native Expression Functions

Use the Integration Toolkit wiki - setup discussion thread for discussion about this wiki.

{ Please add, within reason, to these best practices. }

Problem Index

Link your comments below here, as bullets, with a one-line description as the link text.


Perspective Table Data and Columns Config from Datasets

When using the JSON return format in a Named Query, the column order and original column datatypes are lost. You can move a Named Query binding to a custom property of the component or view, then use this simple iteration expression to deliver the jsonified content to the table:

Binding on props.data
forEach(
	{path.to.source.dataset.prop},
	asMap(it())
)
Binding on props.columns (optional)
forEach(
	columnsOf({path.to.source.dataset.prop}),
	if(
		it()[0] = 'special_column_name',
		asMap(
			'field', it()[0],
			'sortable', true,
			'editable', true,
			'filter', asMap('enabled', true, 'visible', 'on-hover'),
			'resizable', true //,
	//		'header', asMap('title', replace(runScript(concat("'",it()[0],"'",'.title()')),'_',' ')), // don't do this
	//		'render', it()[1] // not necessary (ds column types are implicit)
	//		add additional properties as needed
		),
		asMap('field', it()[0])
	)
)

@pturmel - can we get a title() function?

Features employed
  • Iterables General behavior of all iterators.
  • forEach() Loops through the dataset, calling the nested expression with one row at a time.
  • it() Delivers the one row in dataset format (same column names and types as the source).
  • asMap() Converts the first row of the dataset it is given into a mapping (dictionary) of key-value pairs.
  • columnsOf() Returns an ordered map of the column names versus column type class names (as strings).

Creating a custom property dataset using a temporary binding

There isn't a built-in method to create a dataset in a Perspective custom property. Creating an expression binding with the expression below will generate one. The binding can then be removed, leaving the dataset in place.

Expression binding on custom property
unionAll(
	asMap(
		"someDate", "date",
		"someBool", "B",
		"someInt", "I",
		"someString", "str"
	),
	asList() // No row data.
)
/* Returns
"#NAMES"
"someDate","someBool","someInt","someString"
"#TYPES"
"date","B","I","str"
"#ROWS","0"
*/
Features employed
  • unionAll() Assembles an output dataset from scratch, using the given column names and types (internally via a DatasetBuilder), performing a UNION ALL with each row source.
  • asList() unconditionally assembles all of its arguments into a List.

You really only need to create one column with the binding and then modify the result with the Dataset Editor. (Click the icon to the right of the Dataset [...].)
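For example, a minimal one-column starting point (the column name and type here are arbitrary placeholders):

Single-column expression binding
unionAll(
	asMap('placeholder', 'str'), // single column; add more columns/rows afterwards with the Dataset Editor
	asList() // No row data.
)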


Merging lists into a single list of objects

How would I merge multiple lists into a single list of objects/dicts?
e.g.

a = [1,2,3,4]
b = ['a','b','c','d']
c = {'a': 1, 'b': 2, ...} # create this one

Iterate over one while picking from the other. But use transform() to snapshot for use within the iterator.

Emulation of dict(zip(b, a))
transform(
	{path.to.list.a},
	asMap(
		forEach(
			{path.to.list.b},
			it(),
			value()[idx()]
		)
	)
)

Note that the toolkit's asPairs() function works like python's .extend(), not zip().

You can shuffle the order of operations to get various output types.
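For example, dropping the outer asMap() from the expression above should yield the list of key-value pairs itself, i.e. an emulation of list(zip(b, a)) (a sketch, using the same hypothetical property paths):

transform(
	{path.to.list.a},
	forEach(
		{path.to.list.b},
		it(),
		value()[idx()]
	)
)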

Features Employed
  • transform() Makes a snapshot of list a.
  • Iterables General behavior of all iterators.
  • forEach() Loops through the list of keys, calling the nested expressions with one key at a time, yielding a list of pairs.
  • it() Delivers the key from list b.
  • value() Retrieves the snapshot of list a.
  • idx() Retrieves the zero-based loop index to use as a subscript into list a.
  • asMap() Creates a map of the list of key-value pairs.

{ Someone forgot to make their question a wiki.... }


Filtering datasets and emulating the IN operator

Many times you want to filter a dataset where a column's value matches any item in a list. In SQL this is possible using the IN operator. However, outside of using a QueryString parameter, the way Named Query parameters work means you cannot supply a list of values.

If you create a custom property to hold the raw dataset from the named query, you can then use a simple expression binding to filter the dataset further, in a manner similar to the SQL IN operator.

Filtering by a List of Values
unionAll(
    columnsOf({path.to.raw.data}),
    forEach(
        {path.to.List},
        where(
            {path.to.raw.data},
            it()['ColumnToFilter'] = it(1)
        )
    )
)

Note that for filtering by a single value, only the where() expression is needed.

Filtering by a Single Value
where(
    {path.to.raw.data},
    it()['ColumnToFilter'] = {path.to.filter.value}
)
Features employed
  • Iterables General behavior of all iterators.
  • unionAll() Assembles the list of filtered row sources into a new filtered dataset.
  • columnsOf() Returns an ordered map of the column names versus column type class names (as strings).
  • forEach() Loops through the list of items, for comparison in the conditional argument of the where() expression.
  • where() Prunes the source data according to supplied conditional expressions.
  • it([depth]) Delivers the current iteration's value based on the depth argument supplied.

Note that for static lists, the asList() expression can be used in place of {path.to.List}.
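For example, a sketch with a static list of filter values (the values here are hypothetical):

Filtering by a Static List of Values
unionAll(
    columnsOf({path.to.raw.data}),
    forEach(
        asList('ValueA', 'ValueB', 'ValueC'),
        where(
            {path.to.raw.data},
            it()['ColumnToFilter'] = it(1)
        )
    )
)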


Add Conditional Styling to Perspective Table Data Rows

Add conditional styling, such as font color or background color, based on a cell value.

Conditional Row Background Color
forEach(
	{this.custom.data},
	asMap(
		'value', it(), // use asMap(it()) for dataset sources
		'style', asMap('backgroundColor', if(it()['population'] > 1000000, 'red', ''))
	)
)
Features employed
  • forEach() Loops through the source, calling the nested expression with one row at a time.
  • asMap() Assembles the 'value' and 'style' key-value pairs for each row.
  • it() Delivers the current row of the iteration.
  • if() Native Ignition expression selecting the background color based on the cell value.

Note that for dataset sources, each row needs to use asMap() for the value key (see expression comments).
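For example, a sketch of that dataset-source variant (the custom property path is hypothetical):

Conditional Row Background Color (dataset source)
forEach(
	{this.custom.dataset},
	asMap(
		'value', asMap(it()), // it() is a one-row dataset; asMap() converts it to a map
		'style', asMap('backgroundColor', if(it()['population'] > 1000000, 'red', ''))
	)
)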


Merging lists into a single list of objects where each list is represented by its own key in each object

How would I merge multiple lists into a single list of objects/dicts, where each list is given its own key name?
e.g.

BobList = [1,2,3,4]
PeterList = ['a','b','c','d']
MergedList = [{'Bob': 1, 'Peter': 'a'}, {'Bob': 2, 'Peter': 'b'}, ...] # create this one

I want to be able to name the keys explicitly; they won't match the list names exactly.


Solutions

Requirements:

  • The number of supplied keys is equal to the number of lists to combine
  • The lists being combined are the same length
  • The keys should all be unique; otherwise only the last list's values associated with that key will be used.

Simple solution by @Transistor

Expression binding
forEach(
	asList(1, 2, 3, 4),
	asMap(
		'Bob', it(),
		'Peter', asList('a','b','c','d')[idx()] 
	)
)
or

forEach(
	{this.custom.BobList},
	asMap(
		'Bob', it(),
		'Peter', {this.custom.PeterList}[idx()] 
	)
)

Note: If BobList is longer than PeterList, the subscript will run past the end of PeterList and the expression will return null.

Advanced solution by @lrose

Use transform() to snapshot a list of lists (the lists to be combined), then use forEach() in its range form to "enumerate" the lists. Finally, use asMap() and forEach() to loop over the list of keys and return a list of lists, where each nested list is a key-value pair for the map.

Expression binding
transform(
    asList(
        asList(1,2,3,4),
        asList('a','b','c','d')
    ),
    forEach(
        len(value()[0]), // acts like Python's range()
        asMap(
            forEach(
                asList('Bob','Peter'),
                asList(
                    it(),value()[idx()][idx(1)]
                )
            )
        )
    )
)

or

transform(
    {path.to.list.of.lists},
    forEach(
        len(value()[0]),
        asMap(
            forEach(
                {path.to.list.of.keys},
                asList(
                    it(),value()[idx()][idx(1)]
                )
            )
        )
    )
)
Advantages

The main advantage of this approach over the one provided by @Transistor is that it allows for any arbitrary number of list/key combinations without further editing of the expression. This means the expression will work even if its inputs are produced dynamically, so long as the requirements above are met.

Using transform() allows for a single reference to the list of lists, meaning it can be referenced multiple times throughout the expression without repeatedly dereferencing the property path to get the value. This makes the expression more performant, especially as the number of list/key combinations grows.

Some clarification on the innermost forEach() loop:

In the expression it(), value()[idx()][idx(1)]: it() is the current key, value() is the list of lists, [idx()] is the current index of the key (used to retrieve the list associated with that key from the list of lists), and [idx(1)] is the value in that list at the "enumerated" index.

So the first time through the loop, it would return [['Bob',1],['Peter','a']], which when handed to asMap() results in the map {'Bob':1,'Peter':'a'}.

Functions used
  • transform() Makes a snapshot of the list of lists.
  • forEach() loops through the items in the source.
  • asMap() produce a Java Map with string keys.
  • asList() unconditionally assembles all of its arguments into a List.
  • len() returns the length of its argument.
  • it() Delivers the key from the list of keys.
  • idx([depth]) Retrieves the loop index of the corresponding forEach() iterator, based on the value of its depth argument.

Filter table data using where()

A very simple example of using the where() filter on the Perspective table component's default data.

Expression binding on custom property
where(
	{this.props.data},
	it()['country']  = 'China'  // case sensitive.
)
Features employed
  • where() prunes the source data according to supplied conditional expressions.
  • it() retrieves the loop value for an iterator from inside the nested expression.

Incremental readings from totalising meter using asMap() and lag()

Useful for extracting the difference in periodic meter readings.

Sample dataset
"#NAMES"
"t_stamp","reading"
"#TYPES"
"date","I"
"#ROWS","6"
"2025-04-17 00:00:00.000","5"
"2025-04-17 01:00:00.000","7"
"2025-04-17 02:00:00.000","11"
"2025-04-17 03:00:00.000","17"
"2025-04-17 04:00:00.000","25"
"2025-04-17 05:00:00.000","35"

or paste the JSON below onto a component's custom.data.

[
  {"t_stamp": 1744844400000, "reading": 5 },
  {"t_stamp": 1744848000000, "reading": 7},
  {"t_stamp": 1744851600000, "reading": 11},
  {"t_stamp": 1744855200000, "reading": 17},
  {"t_stamp": 1744858800000, "reading": 25},
  {"t_stamp": 1744862400000, "reading": 35}
]
Output
[
  {"t_stamp": 1744844400000, "Increment": null},
  {"t_stamp": 1744848000000, "Increment": 2},
  {"t_stamp": 1744851600000, "Increment": 4},
  {"t_stamp": 1744855200000, "Increment": 6},
  {"t_stamp": 1744858800000, "Increment": 8},
  {"t_stamp": 1744862400000, "Increment": 10}
]
Expression binding

e.g., on Perspective custom property:

forEach(
	{this.custom.data},
	asMap(
		't_stamp', it()['t_stamp'], 
		'Increment', it()['reading'] - lag()['reading']
	)
)
Functions employed
  • forEach() loops through the items in the source.
  • asMap() produce a Java Map with string keys.
  • it() retrieves the loop value for the iterator.
  • lag() retrieves the previous loop value for the iterator. It returns null for the first pass, so the incremental reading will be null for the first interval. (The null in the calculation seems to be handled neatly, with asMap() returning a null for that entry.) See the variant sketched below if a zero is preferred.
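If a zero increment is preferred over null for the first row, a possible variant (a sketch, not part of the original solution) wraps the lag() reference in coalesce():

forEach(
	{this.custom.data},
	asMap(
		't_stamp', it()['t_stamp'],
		'Increment', it()['reading'] - coalesce(lag()['reading'], it()['reading']) // first pass: reading - reading = 0
	)
)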

Simple orderBy() example

Reorder the given source data by key.

Source data

Paste this into a custom property, unsorted.

[
	{"fruit": "apples",  "count": 10},
	{"fruit": "bananas", "count": 8},
	{"fruit": "citrus",  "count": 11},
	{"fruit": "dates",   "count": 7}
]
Expression binding

Create this binding on another custom property. e.g., sorted.

orderBy(
	{this.custom.unsorted},
	it()['count']
)
Result
[
  {"fruit": "dates",   "count": 7},
  {"fruit": "bananas", "count": 8},
  {"fruit": "apples",  "count": 10},
  {"fruit": "citrus",  "count": 11}
]

descending() example

Expression binding
orderBy(
	{this.custom.unsorted},
	descending(
		it()['count']
	)
)
Result
[
  {"fruit": "citrus",  "count": 11},
  {"fruit": "apples",  "count": 10},
  {"fruit": "bananas", "count": 8},
  {"fruit": "dates",   "count": 7}
]

naturalOrder() example

Case-insensitive sort.

Expression binding
orderBy(
	{this.custom.unsorted},
	naturalOrder(
		it()['code']
	)
)
Source data
[
  {"fruit": "apples",  "code": "a10"},
  {"fruit": "bananas", "code": "8b"},
  {"fruit": "citrus",  "code": "A10"},
  {"fruit": "dates",   "code": 7}
]
Result
[
  {"code": 7,     "fruit": "dates"},
  {"code": "8b",  "fruit": "bananas"},
  {"code": "a10", "fruit": "apples"},
  {"code": "A10", "fruit": "citrus"}
]

Note that a10 is output before A10.

naturalCasedOrder() example

Case-sensitive sort.

Expression binding
orderBy(
	{this.custom.unsorted},
	naturalCasedOrder(
		it()['code']
	)
)
Result
[
  {"code": 7,     "fruit": "dates"},
  {"code": "8b",  "fruit": "bananas"},
  {"code": "A10", "fruit": "citrus"}.
  {"code": "a10", "fruit": "apples"}
]

Note that A10 is output before a10.


Functions used
  • orderBy() reorders the given source data per the given key expressions.
  • it() retrieves the loop value for an iterator from inside the nested expression.
  • descending() reverses the comparison order of any comparable handed to it.
  • naturalOrder() orders in case-insensitive alphanumeric order.
  • naturalCasedOrder() orders in case-sensitive alphanumeric order.

Get the row index of a dataset/list matching criteria - Matches lookup function but returns row index

I want to replicate the functionality of lookup(...) except instead of returning a cell value, I want to return the index of the found row.

Wrap a forEach() inside where() to include the original index in the output.

Index Lookup
where(
	forEach( // Expand the source into pairs of original index and content there
		{path.to.source},
		idx(),
		it()
	),
	it()[1]['someColumn'] = 'SomeValue' // original content is the 2nd element of the pair
)[0][0] // Extract first match, then original index is the 1st element of the pair

Note that most applications that need the index actually need it to retrieve multiple parts of the found row; it is usually better to simply use where() to return the entire first matching result. { I want to use this for the scrollIntoView JS function, which needs the row index }
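For comparison, a minimal sketch of that simpler approach, returning the entire first matching row instead of its index (same hypothetical paths as above):

First Matching Row
where(
	{path.to.source},
	it()['someColumn'] = 'SomeValue'
)[0] // the whole first matching row, rather than just its index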

If the source is sure to be a dataset, consider using selectStar() to add a new column holding the original index instead of making pairs. Something like this:

Dataset Row Lookup
where(
	where(
		selectStar( // Expand the source to include an original index column
			{path.to.source.dataset},
			asMap('_original_idx', 'I'),
			idx()
		),
		it()['someColumn'] = 'SomeValue' // any ordinary condition
	),
	idx() < 1 // Extract first row
)
Functions Used
  • where() Prunes the source data according to the supplied conditional expressions.
  • forEach() Expands the source into pairs of original index and row content.
  • idx() Retrieves the zero-based loop index (the original row index).
  • it() Retrieves the loop value (pair or row) inside the nested expressions.
  • selectStar() Copies the source dataset and appends the extra column described by the given name/type map and expression (here, the original row index).
  • asMap() Supplies the new column name and type code for selectStar().

timeMe() expression execution timer

[I've reached the limit of two consecutive posts again, so I'm posting this here temporarily until someone else posts something and then I can post it separately.]

This function injects an MDC (Mapped Diagnostic Context) filter key into the logging system during expression execution, allowing a specific expression's activity to be easily located in the logs.

Expression
timeMe(
	'timeMeLog', // string to be returned with the MDC.
	forEach(     // sample expression to be timed.
		asList(0, 1, 2, 3, 4, 5, 6, 7, 8, 9),
		asMap(
			'Number', it(),
			'Square', it() ^ 2
		)
	)
)

Results are available in the gateway web page → Status → Logs.
Note the MDC string from the expression.

Functions used
  • timeMe() Wraps the given expression, timing its execution and tagging the resulting log entries with the given MDC key string.
  • forEach() Loops through the items in the source list.
  • asList() Unconditionally assembles all of its arguments into a List.
  • asMap() Produces a Java Map with string keys.
  • it() Retrieves the loop value for the iterator.

alias()


Given a dataset on a custom property, e.g. custom.myDataset, the expression

alias({this.custom.myDataset}, 'new_')

returns the same data with each column name prefixed with new_. It just works! Per the documentation: "Ideal for use with the various JOIN operations below to avoid column name clashes."

See alias().

columnsOf()

columnsOf({this.custom.myDataset}) returns

{
  "someBool": "B",
  "someDate": "date",
  "someString": "str",
  "someInt": "I"
}

toTransient() and nonTransient()

Can someone edit this to give examples of what these functions are supposed to do? I seem to get back the original dataset.


That is essentially what they do, on the surface. A little deeper, they change whether the content of the dataset is serializable or not. The content of a TransientDataset is not serializable, meaning it is not persistent. I would need to rely on Phil or perhaps someone from IA to say which expressions return a TransientDataset; the exception is the recorder() expression from the toolkit, which is noted as returning one.

I personally have not come up with a need for these two expressions, but I am sure that Phil saw a need for them and so they exist.

@lrose, thanks. What exactly does 'serializable' mean in this context?

@Transistor It means that the class implements the Serializable interface, which tells the JVM that the state of an object can be serialized into, and consequently deserialized from, a byte stream. In other words, the state of the object can be persisted for later use.
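Not from the discussion above, but as a minimal usage sketch (assuming both functions simply take a dataset and return its transient or non-transient counterpart; property paths are hypothetical):

toTransient({this.custom.someDataset}) // mark the dataset's content as non-serializable (non-persistent)

nonTransient({this.custom.recordedData}) // e.g. convert a TransientDataset (such as recorder() returns) back into an ordinary, serializable dataset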

jsonDecode()

Takes a JSON string and converts it into a Python object such as a list or a dictionary. If the input is not valid JSON, a string is returned.

jsonDecode("{'item': [10, 20, 30, 40, 50]}")
// returns  {"item": [10, 20, 30, 40, 50]}

jsonDecode("{'item': [10, 20, 30, 40, 50]}")['item']
// returns           [10, 20, 30, 40, 50]

jsonDecode("{'item': [10, 20, 30, 40, 50]}")['item'][1]
// returns                20

jsonEncode()

Can someone edit this to give a working example and explain why this (example from system.util.jsonEncode()) doesn't work with the Integration Toolkit?

jsonEncode({view.custom.employees})
[
  {
    "value": {
      "firstName": {
        "value": "John",
        "quality": {
          "code": 0
        },
        "timestamp": "Apr 26, 2025, 12:01:14 PM"
      },
      "lastName": {
        "value": "Doe",
        "quality": {
          "code": 0
        },
        "timestamp": "Apr 26, 2025, 12:01:14 PM"
      }
    },
    "quality": {
      "code": 0
    },
    "timestamp": "Apr 26, 2025, 12:01:14 PM"
  },
  {
    "value": {
      "firstName": {
        "value": "Anna",
        "quality": {
          "code": 0
        },
        "timestamp": "Apr 26, 2025, 12:01:14 PM"
      },
      "lastName": {
        "value": "Smith",
        "quality": {
          "code": 0
        },
        "timestamp": "Apr 26, 2025, 12:01:14 PM"
      }
    },
    "quality": {
      "code": 0
    },
    "timestamp": "Apr 26, 2025, 12:01:14 PM"
  },
  {
    "value": {
      "firstName": {
        "value": "Peter",
        "quality": {
          "code": 0
        },
        "timestamp": "Apr 26, 2025, 12:01:14 PM"
      },
      "lastName": {
        "value": "Jones",
        "quality": {
          "code": 0
        },
        "timestamp": "Apr 26, 2025, 12:01:14 PM"
      }
    },
    "quality": {
      "code": 0
    },
    "timestamp": "Apr 26, 2025, 12:01:14 PM"
  }
]

custom.employees is a simple array of firstName/lastName dictionaries. Why does jsonEncode() add quality and timestamp keys and values?

jsonEncode(
	{"employees": [{"firstName": "John", "lastName": "Doe"}, {"firstName": "Anna", "lastName": "Smith"}, {"firstName": "Peter", "lastName": "Jones"}]}
)

Why does this give a "Nested paths not allowed (Line 2, Char. 17)" error?

Perspective makes everything a qualified value when assigning to its properties, including values in arrays and dictionaries. Try jsonEncode(unQualify(...)).
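A sketch of that suggestion (assuming view.custom.employees holds the array shown above):

jsonEncode(
	unQualify({view.custom.employees})
)
// expected to return the plain array, e.g. [{"firstName": "John", "lastName": "Doe"}, ...]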

parsePath().toString() equivalent? :grimacing:

How do I get a stringified qualified path (a parsePath().toString() equivalent) back out of parsePath()?

Forgive my lack of familiarity if there’s an easy way - I try to avoid datasets unless absolutely necessary. (e.g. Table props.columns derivation).

Perhaps a "ds vs object" blurb might be useful if my ignorance is showing.

here's my uuugly method (first try)
{
  "type": "expr",
  "config": {
    "expression": "parsePath(\r\n\tconcat(\r\n\t\t{view.custom.node.meta_node.Tag_Path}, // concat(\u0027[\u0027, {session.custom.project.tag.provider}, \u0027]\u0027), \u0027\u0027),\r\n\t\t\"/Analog Tags/\",\r\n\t\tif(\r\n\t\t\t{view.custom.Type_ID.meta} \u003d \u0027Wl\u0027,\r\n\t\t\t\u0027TEMPERATURE\u0027,\r\n\t\t\t\u0027METER_TEMP\u0027\r\n\t\t)\r\n\t)\r\n)"
  },
  "transforms": [
    {
      "expression": "flatten(forEach(\r\n\t{value},\r\n\tasPairs(it())\r\n))",
      "type": "expression"
    },
    {
      "expression": "where(\r\n\t{value},\r\n\t!isNull(it()[1])\r\n)",
      "type": "expression"
    },
    {
      "expression": "forEach(\r\n\t{value},\r\n\tlower(groupConcat(it(), \u0027:\u0027))\r\n)",
      "type": "expression"
    },
    {
      "expression": "groupConcat({value}, \u0027:/\u0027)",
      "type": "expression"
    }
  ]
}

TL;DR: Use parsePath() in conjunction with tags() reading the storage provider property of the tags. tags() will return a fully de-relativized path that can then be combined with the value returned from tags() and a separately provided system name to construct the historical path.

Expression
transform(
	{path.to.gateway.name},
	forEach(
		tags(
			forEach(
				{path.to.list.of.tag.paths},
				concat(it(),'.historyProvider')
			)
		),
		transform(
			parsePath(it()[0]),
			concat(
				'histprov:',it()[1],  //gets value returned from tags.
				':/drv:',coalesce(value()['drv'],value(1)), //if tags is supplied a path with a driver then use that, otherwise use the provided value
				':',value()['prov'],
				':/tag:',value()['tag']
			)
		)
	)	
)

To produce a proper Historical Path given a simple tag path, you need to know a few things.

  1. The Historical Provider that is configured for the given tag path.
  2. The name of the gateway the Historical Provider is on. (In a single gateway system, it's just the gateway name; for remote providers you need the gateway name and the tag provider on the remote gateway.)

A historical tag path looks like this:

histprov:hist_prov_name:/drv:gateway_name:tag_provider_name:/tag:tag_path

This of course raises the question of how you can dynamically retrieve that information.

In a single gateway system it's pretty simple. There is a System Tag SystemName that will give you the gateway name, and you can actually read the historyProvider property from the tag, given the tag path. So the first is a simple tag binding, and the second is a simple Indirect Tag Binding.
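For illustration, hedged sketches of those two bindings using Ignition's native tag() expression (tag paths assumed; an Indirect Tag Binding on the .historyProvider property works equally well):

tag('[System]Gateway/SystemName') // gateway name

tag(concat({this.custom.tagPath}, '.historyProvider')) // history provider configured for the given tag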

For a multiple gateway system it is not that simple. For reference see: Remote Tag Provider History Queries

Once you have those pieces of information, it's just a matter of parsing the given tag path into its parts and reconstructing them into the historical tag path.

So, given the path [default]_Simulator_/Ramp/Ramp0, in a single gateway system where the System Name is "Ignition_Gateway" and the historical provider is "History_Provider", this expression will return the correct historical tag path.

Expression for Single Tag Path
transform(
	parsePath({this.custom.tagPath}),
	lower(
		concat(
			'histprov:', coalesce(value()['histprov'],{this.custom.histProv}),':/',
			'drv:',coalesce(value()['drv'],{this.custom.gatewayName}),':',coalesce(value()['prov'],'default'),':/',
			'tag:',value()['tag']
		)
	)
)

NOTE: I don't believe that paths are case sensitive, but if you want this to match the output from say the Power Chart, then the lower() does that.

Output
histprov:history_provider:/drv:ignition_gateway:default:/tag:_Simulator_/Ramp/Ramp0

If you provide parsePath with multiple strings, then it returns a row for each string. Modifying the above expression with forEach will allow you to return a list of multiple historical tag paths.

Expression for Multiple Tag Paths
transform(
	parsePath({this.custom.tagPath},{this.custom.tagPath1}),
	forEach(
		value(),
		lower(
			concat(
				'histprov:', coalesce(it()['histprov'],{this.custom.histProv}),':/',
				'drv:',coalesce(it()['drv'],{this.custom.gatewayName}),':',coalesce(it()['prov'],'default'),':/',
				'tag:',it()['tag']
			)
		)
	)
)
Features employed
  • Iterables General behavior of all iterators.
  • transform() Makes a snapshot of the dataset returned from parsePath().
  • parsePath() Returns the QualifiedPath parts as a dataset.
  • tags() Reads the values of the given tag paths (here, each tag's .historyProvider property).
  • coalesce() Returns the first non-null argument.
  • concat() Returns the concatenation of all provided arguments.
  • lower() Returns the provided string in all lower case.
  • forEach() Loops through each row in the dataset.
  • it() Delivers the row from this iteration through the dataset.

@lrose - surely the intent of parsePath() was to make this process purely dynamic (as opposed to hardcoding the path part names as your examples do)?

@hunterdg - My examples are dynamic. Given a tag path, this will output a historical tag path. There are a couple of other bindings involved as explained, but it will change dynamically with the path.

The intent of parsePath is to break a string path down into its constituent parts. A simple tag path by itself does not contain the needed information to build a corresponding historical tag path. Information must be retrieved and inserted. parsePath on its own does not do any extra look up.
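For illustration (an assumed example, with the part names taken from the expressions above), parsePath() applied to a plain tag path yields only the parts that are actually present:

parsePath('[default]_Simulator_/Ramp/Ramp0')
// roughly: histprov = null, drv = null, prov = 'default', tag = '_Simulator_/Ramp/Ramp0'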

@hunterdg I suppose this may be what you're looking for, although it will not provide a valid historical tag path unless it is handed one.

Expression
transform(
	parsePath({this.custom.tagPath},{this.custom.tagPath1}),
	forEach(
		value(),
		groupConcat(
			flatten(
				forEach(
					columnsOf(it()), // columns of dataset returned from parsePath
					forEach(
						where(
							it(1), // rows of dataset returned from parsePath
							!isNull(it()[it(1)[0]])
						),  // all non-null values in this row
						concat(it(1)[0],':',it()[it(1)[0]]) // columnName + : + columnValue
					)
				)
			),
			':/'
		)
	)
)

@lrose - This does indeed work for a non-historical (but PowerChart-compatible) QualifiedPath as you point out, but I’m somewhat disappointed as it seems to be basically the consolidated version of my original method (which is still very much appreciated). I assumed there must be an easier way to concat dataset rows interpolated with their column names. Fairly certain QualifiedPath.toString() performs the equivalent and is much more approachable, despite the overhead (and regardless, I believe a scripting context is needed to build a “proper” historical QualifiedPath object anyway).

EDIT - @pturmel’s original post mentions “(I've been frustrated with the need for scripts to produce proper historical paths from simple tag paths.)”…

I now see that parsePath() requires passing a string (with histprov + drv) to return a historical path - guess I missed that critical (and obvious in hindsight) piece - so I might as well just pass the historical path strings straight to the Power Chart. It is still useful to derive non-historical stringified qualified paths from a QualifiedPath object, though.

Feel free to clean up this post. I consider my question "mostly" answered - but I'm still uncertain as to the rationale for parsePath(), as it most certainly does not seem to "produce proper historical paths from simple tag paths" as the original post suggests.

@hunterdg parsePath() is best used in conjunction with tags() to read the Storage Provider property of tags. If you use parsePath() on the returned full tag path, you can use its parts, together with the value returned from tags() and a separately provided system name, to construct the historical qualified path from a basic tag path. A big concat() operation.

@pturmel :folded_hands:

Merging lists of dicts

Given identical-length lists of dicts, how do I merge the corresponding nested dicts of each into a single list?

Merge corresponding dictionaries in lists
transform(
	{path.to.list.a},
	forEach(
		{path.to.list.b},
		asMap( // Convert lists of pairs to a dictionary
			asPairs( // convert dictionaries to lists of pairs
				it(),
				value()[idx()],
				asMap( // bonus
					'additionalKey1', 'additionalValue1',
					'additionalKey2', 'additionalValue2'
				)
			)
		)
	)
)

Bonus: how to add additional key/value pairs that don't already exist (see the 'bonus' comment in the expression above).
Bonus 2 (my actual use case): insert the dicts from one list into the dicts of the other as nested dicts.

Inject dictionaries from list A into a key of dictionaries from list B
transform(
	{path.to.list.a},
	forEach(
		{path.to.list.b},
		asMap( // Convert lists of pairs to a dictionary
			asPairs( // convert dictionaries to lists of pairs
				it(),
				asMap( //bonus2
					'nestedKey', value()[idx()]
				)
			)
		)
	)
)

Reformatting a list of dictionaries, like the default data of the Perspective table, back into a dataset, extracting nested .value elements where necessary.

Dataset from List of Mappings
unionAll(
	asMap(
		'city', 'str',
		'country', 'str',
		'population', 'I'
	),
	forEach(
		{/root/Table.props.data},
		asList(
			unMap(it()['city'], 'value'),
			unMap(it()['country'], 'value'),
			unMap(it()['population'], 'value')
		)
	)
)