I am looking for some advice on touch events vs. mouse events for running button scripts. I am using a Hope HMI as the terminal screen and an Android tablet as a remote viewer. The scripts work great when executing from my laptop or from the Hope HMI, but I have to repeatedly tap buttons on the HMI to fire the script. Also, I tried adding scripts to both event types on the same button, but that didn't help. Any advice on how to manage this without creating separate HMI buttons and tablet buttons?
When using a button component, I have always had the best luck with the “On Action Performed” event. It works with mouse, touchscreen, and the Perspective app.
If you’re using a component that doesn’t have “On Action Performed” (e.g. an embedded view, a label, etc.), using onClick may be causing your problems. onClick requires mouse-down and mouse-up (press and release) to occur on the same pixel. Touchscreens tend to have some jitter while you press, hence the script not firing unless you press very deliberately. I tend to use the mouse-down event in these scenarios.
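To illustrate the difference, here's a minimal sketch in plain Python (not the Ignition API, and the single-pixel rule is the simplified model described above, not a spec quote): a click handler only fires when press and release land on the same pixel, while a mouse-down handler fires on the press alone, so touch jitter can't break it.

```python
def fires_on_click(press_xy, release_xy):
    """Click handlers fire only when press and release land on the same pixel."""
    return press_xy == release_xy

def fires_on_mouse_down(press_xy, release_xy):
    """Mouse-down handlers fire as soon as the press lands, regardless of release."""
    return True

# A steady mouse press: both event strategies fire.
print(fires_on_click((100, 50), (100, 50)))       # True
print(fires_on_mouse_down((100, 50), (100, 50)))  # True

# A touch press with 1 px of jitter on release: only mouse-down fires.
print(fires_on_click((100, 50), (101, 50)))       # False
print(fires_on_mouse_down((100, 50), (101, 50)))  # True
```

That last case is exactly the repeated-tap symptom: most touch presses drift a pixel or two between down and up, so the click never registers.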
Note: this knowledge comes from Vision, but I think it applies to Perspective's onClick too. There may also be nuance depending on your hardware's touch drivers and whether they're programmed to report mouse events, touch events, or both.