Recreate Designer's Script Console in Perspective?

I am having data analysts run scripts built around system.tag.queryTagHistory() in order to get "wide" tables out of our Historian (Postgres), suitable for use within analytics tools like JMP.
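For context, the scripts are roughly along these lines; the tag paths, date range, and output path here are placeholders rather than our real configuration:

```python
# Example of the kind of Script Console query the analysts run today.
# Tag paths, date range, and output path are placeholders.
end = system.date.now()
start = system.date.addDays(end, -7)   # last 7 days

paths = [
    "[default]Line1/Temperature",
    "[default]Line1/Pressure",
]

# "Wide" format: one row per timestamp, one column per tag.
dataset = system.tag.queryTagHistory(
    paths=paths,
    startDate=start,
    endDate=end,
    returnFormat="Wide",
    aggregationMode="Average",
    returnSize=0,          # "natural" number of samples
)

# Dump to CSV so JMP (or any other analytics tool) can open it.
csv = system.dataset.toCSV(dataset, showHeaders=True)
system.file.writeFile("C:/exports/line1_history.csv", csv)
```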

I need to add more analysts to this group, but I don't like giving them all full Designer access just to have them use the Script Console. Is there a way to recreate this function as a Perspective app so they can use it in their browser? Failing that, could I do it in Vision? Or are there other ways to go about this that I am missing? Thank you.

Is there a way to access the Ignition Gateway domain from an external Python scripting environment? Meaning that I could use system.tag.queryTagHistory() in another Python environment to access the Historian data?

Sure. A Perspective UI can certainly call this API, and produce CSVs or Excel files for download.

Some reading for you:

https://forum.inductiveautomation.com/search?q=system.tag.queryTagHistory%20system.perspective.download
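As a minimal sketch of the pattern (the sibling component names and properties below are assumptions; wire up whatever inputs you actually use):

```python
# Sketch of a Perspective button onActionPerformed script.
# Component names and property paths are assumptions for illustration.
def runAction(self, event):
    # Read the query configuration from sibling input components.
    paths = self.getSibling("TagPathsMultiselect").props.value   # list of tag paths
    start = self.getSibling("StartDatePicker").props.value
    end = self.getSibling("EndDatePicker").props.value

    dataset = system.tag.queryTagHistory(
        paths=paths,
        startDate=start,
        endDate=end,
        returnFormat="Wide",
        aggregationMode="Average",
        returnSize=0,       # "natural" number of samples
    )

    # Convert to CSV and push it to the user's browser as a file download.
    csv = system.dataset.toCSV(dataset, showHeaders=True)
    system.perspective.download("tag_history.csv", csv, contentType="text/csv")
```

The download call streams the CSV to the session's browser, so nothing needs to be written to the gateway's filesystem.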

Thank you for the info. It will be useful to build a CSV export application, with check boxes, pull-down lists etc. to configure the file to be downloaded.

I was also hoping to build a more general-purpose Python script console, though, since I use the Designer's console so often for experimentation. ChatGPT and I could extract the script from a text area component and execute it with Python's exec(script), but getting the console output to appear in the Perspective window proved problematic. Many "can't write str to text stream" errors ensued.

Extremely dangerous in a web application. It is the poster child for a "Remote Code Execution" vulnerability.

Note that the designer script console doesn't execute on the gateway. It executes on the designer workstation with workstation privileges. Perspective runs all its scripts on the gateway, where the slightest mistake (or malice) could be catastrophic.

That you are using ChatGPT tells me you are too inexperienced to be trusted with such a tool.

When you are sufficiently experienced to not use ChatGPT, you will understand why it is a terrible idea, and not do it at all.

I'm now realizing that providing full scripting access could be just as dangerous as providing full Designer access, which is what I was trying to avoid with this exercise.

Belittling my skill and experience just because I've used ChatGPT was uncalled for, however.

This forum's rules strongly discourage the use of any LLM for Ignition tasks (not quite banned, but close).

On top of that, I've seen too many posts on other technical fora where an LLM produced utterly bogus code that fooled an inexperienced person. LLMs are really good bull-[expletive]ers when they don't find the right answer in their training material.

My criticism of your use of ChatGPT is not personal at all. (I reject the "belittling" characterization. It's just an uncomfortable truth.)

Moving forward, consider making a Perspective user interface that accepts the common parameters you need for history queries and downloads, with a fixed script.
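As a rough illustration (the module name, tag whitelist, and limits are placeholders), a project library function like this keeps the script fixed while the UI only supplies parameters:

```python
# Sketch of a project library function (e.g. in a script module named "exports").
# The script itself is fixed; analysts only pick from whitelisted tags and a
# bounded date range, so no free-form code ever runs on the gateway.

ALLOWED_TAGS = {
    "[default]Line1/Temperature",
    "[default]Line1/Pressure",
    "[default]Line2/Flow",
}
MAX_RANGE_DAYS = 31

def historyCsv(paths, start, end, aggregation="Average"):
    """Validate the requested parameters and return a wide-format CSV string."""
    bad = [p for p in paths if p not in ALLOWED_TAGS]
    if bad:
        raise ValueError("Tag(s) not permitted: %s" % ", ".join(bad))

    if system.date.daysBetween(start, end) > MAX_RANGE_DAYS:
        raise ValueError("Date range limited to %d days" % MAX_RANGE_DAYS)

    dataset = system.tag.queryTagHistory(
        paths=list(paths),
        startDate=start,
        endDate=end,
        returnFormat="Wide",
        aggregationMode=aggregation,
        returnSize=0,
    )
    return system.dataset.toCSV(dataset, showHeaders=True)
```

The Perspective button then just calls this with the user's selections and hands the returned CSV to system.perspective.download(), so no analyst-supplied code ever reaches the gateway.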

5 Likes

This is always going to get barked at no matter where it's posted, because putting exec where it accepts user input is a well-known bad practice.

I'm aware that a lot of experienced developers have been creating novel ways to interface with ChatGPT's API, but we don't see much of that cool stuff here. The regular volunteers of this forum have been confronted with a lot of embarrassingly hallucinated LLM code, and from what I've read, other forums haven't fared much better. I imagine there are scenarios where the prompt outputs have been a success, but there simply isn't much cause to post code that isn't causing a problem, so we never see those.

Well-earned cognitive biases aside, I am curious: what is your use case here? Why is code entered into a web-facing script console being sent to ChatGPT?

1 Like

Maybe you could set up read-only access to the database and skip the Ignition part entirely for this?
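Something along these lines, assuming a read-only Postgres role and Ignition's standard tag-historian schema; the partition table name, tag paths, and connection details are placeholders you'd have to adjust:

```python
# Sketch: pulling history straight from the Postgres historian with a read-only
# login, outside Ignition. Table/column names follow Ignition's documented
# tag-historian schema (sqlth_te for tag metadata, sqlt_data_<driver>_<year>_<month>
# partitions for samples); everything else here is a placeholder.
import pandas as pd
import psycopg2

conn = psycopg2.connect(
    host="historian-db", dbname="ignition_history",
    user="analyst_ro", password="********",      # read-only role
)

query = """
    SELECT te.tagpath,
           to_timestamp(d.t_stamp / 1000.0) AS t_stamp,
           COALESCE(d.floatvalue, d.intvalue) AS value
    FROM sqlt_data_1_2024_05 AS d               -- placeholder partition table
    JOIN sqlth_te AS te ON te.id = d.tagid
    WHERE te.tagpath IN ('line1/temperature', 'line1/pressure')
"""

df = pd.read_sql(query, conn)

# Pivot to the same "wide" shape queryTagHistory(returnFormat="Wide") produces:
# one row per timestamp, one column per tag.
wide = df.pivot_table(index="t_stamp", columns="tagpath", values="value")
wide.to_csv("line1_history.csv")
```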