Please add ids to perspective components for writing tests

Good to know I was at least thinking on the right path

Let’s say I have ~20 script modules with 10-30 functions each, and I am trying to think of the cleanest way to write tests for them. On one hand, if a module contains no Ignition-specific code, it’s easy to write a pytest for it. On the other hand, creating tons of views just to support automated testing feels like it would add a lot of bloat to the project itself, rather than keeping the test tooling outside the project.
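For the modules with no Ignition-specific code, the pytest route really is that simple: any pure function can be exercised directly. A minimal sketch (here `split_tag_path` is a hypothetical pure helper standing in for a real project script function):

```python
# test_tag_utils.py -- run with `pytest test_tag_utils.py`
# `split_tag_path` is a stand-in for any project script function
# that makes no system.* calls and can run outside the gateway.

def split_tag_path(path):
    """Split an Ignition-style tag path into its non-empty parts."""
    return [part for part in path.split('/') if part]

def test_returns_list_of_parts():
    assert split_tag_path('Plant/Line1/Motor') == ['Plant', 'Line1', 'Motor']

def test_ignores_empty_segments():
    assert split_tag_path('Plant//Motor/') == ['Plant', 'Motor']
```

Anything that touches `system.*` is the part that needs the in-Ignition approach discussed below.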

I guess one thing you could do is create a generic screen that takes some inputs or query params to define the scripts you want to validate, and simply dumps the resulting data onto labels. That way you have one generic test screen, and for each test you just change the parameters you provide to the page. Does that sound roughly like the path you would choose in this case?

Normally these tests would be written during a project’s development, not after the fact; this one is a special case.

What do your “tests” consist of? Are you checking just a returned value? Are you checking the type? Are you checking lengths of lists? Any of these would have a different recommended route.

If you get tricksy with your Python, you could have a View with two Text Fields, a Numeric Entry Field, and a Flex Repeater which renders a View containing just a Text Field, where that Text Field is bidirectionally bound to an in/out param of the view. You would also need a Button to execute the generated function.

Use the NEF to drive how many instances are displayed in the repeater. Use TF1 to drive the module/package, and TF2 to drive the function in use. Finally, iterate over the instances in the repeater to drive the argument list for the function. On button click, generate the function and set the returned value onto whatever Labels you deem necessary. I recommend at least one for the value and one more with an expression binding: typeOf({sibling.props.text}) (pseudo-code - you’ll want to properly identify the sibling Label’s text prop).
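The core of that button script, resolving a dotted module path and a function name at runtime, then calling it with the collected arguments, can be sketched in plain Python like this. (Inside Perspective you would look the module up in the project scripting scope rather than via `importlib`, and the args would come from the repeater instances; this sketch just uses stdlib names to stay runnable.)

```python
import importlib

def call_by_name(module_path, function_name, args):
    """Resolve a dotted module path (TF1) and function name (TF2),
    then call the function with the argument list gathered from
    the repeater's Text Fields."""
    module = importlib.import_module(module_path)
    func = getattr(module, function_name)
    return func(*args)

# e.g. TF1 = 'math', TF2 = 'hypot', repeater instances -> [3, 4]
result = call_by_name('math', 'hypot', [3, 4])
print(result, type(result).__name__)  # 5.0 float
```

The second Label in the recommendation above is doing the same job as `type(result).__name__` here, just via an expression binding instead of script.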

I didn’t so much mean how to solve the specifics of the test, as opposed to your thoughts on the abstract of "Ignition application tests" in general.

Your example here essentially answers what I am looking for. Write a generic view that allows you to specify what you want to test, execute it with parameters, and then serialize that data back onto the screen in the form of text. Then execute each of your 'tests' by providing the page with what’s needed to call the function, and verify the output in the label (or whatever makes most sense) against your desired output. This would essentially create a "test fixture" that allows you to dynamically call any of your scripting functions directly for unit tests. It could feasibly also be used to write unit tests for individual views, with an embedded view and a view path text box.
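That fixture idea reduces each test to pure data: a module path, a function name, an argument list, and the label text you expect back. A minimal driver-loop sketch (the `math` functions below stand in for real project script modules, and `str()` stands in for whatever the Label renders):

```python
import importlib

# Each case is exactly what you would feed the generic test page:
# (module path, function name, args, expected label text).
CASES = [
    ('math', 'floor', [3.7], '3'),
    ('math', 'hypot', [3, 4], '5.0'),
]

def run_case(module_path, function_name, args, expected_text):
    """Call the named function and compare its serialized result
    against the text we expect to see on the label."""
    module = importlib.import_module(module_path)
    result = getattr(module, function_name)(*args)
    actual_text = str(result)  # what the Label would display
    return actual_text == expected_text, actual_text

for case in CASES:
    passed, text = run_case(*case)
    print('%s.%s -> %r : %s' % (case[0], case[1], text,
                                'PASS' if passed else 'FAIL'))
```

Comparing serialized text keeps the fixture generic, at the cost of losing type information unless you also surface it (e.g. the typeOf label suggested earlier).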

This would allow us to catch simpler errors (say, a named query failing), syntax errors, and so on: the low-hanging fruit in automated testing.

I have actually done something like this in the past, just not exactly as described here. I just wasn’t sure if using the page to scrape out the information was the cleanest path or just a complicated workaround.

It would be nice to be able to externally execute code directly inside the scope of Ignition without a screen, because then testing would be much easier. However, I recognize that’s probably more difficult than it sounds.