My Perspective project (currently on stable 8.0.11 (b2020041411)) will primarily run on Windows 10 tablets (Microsoft Surface Pro) with touch screens, but it also needs to be compatible with desktop PCs (mouse clicks). Both will use the Chrome browser. I need a button on a view to run the same script whether it is touched or clicked.
I quickly realized that I needed to use the onTouch[something] and onMouse[something] events to distinguish mouse clicks from screen touches on the same button. However, I’ve noticed the mouse[up/down] events firing when touching a button on a touch screen.
To demonstrate, I have made a Perspective view with a button and two labels (lblClick and lblTouch).
To the button, I’ve added scripts to the onMouseDown and onMouseUp events to change the lblClick.text prop.
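For reference, the event scripts are one-liners along these lines (a minimal sketch; the label name matches the demo view above):

```python
# onMouseDown event script on the button
def runAction(self, event):
    self.getSibling("lblClick").props.text = "mouseDown"

# onMouseUp event script on the button
def runAction(self, event):
    self.getSibling("lblClick").props.text = "mouseUp"
```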
In this video of me touching the button you can see the lblClick text value change when I’m using a touch screen.
https:/uploads/iatesting/original/2X/2/2a2d4ef9e496a1f2df8f9ee5ea91e324c947d26d.mp4
How do I prevent the onMouseDown event from triggering? It sometimes causes the script to run twice (once in onMouseDown and again in onTouchEnd) when the button is touched.
In my application the button will be used as a momentary button: while the button is held down a motor moves, and when the button is released the motor stops. In some cases the motor moves while the button is held down, but when the button is released the motor does not stop, because the onMouseDown event is triggered again, causing the motor to keep moving.
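For context, the momentary logic is essentially a pair of tag writes (a minimal sketch; the tag path is hypothetical):

```python
# onMouseDown / onTouchStart: request motion
system.tag.writeAsync(["[default]Motors/Axis1/Jog"], [True])

# onMouseUp / onTouchEnd: stop motion; if this event never fires
# (the problem described above), the motor keeps running
system.tag.writeAsync(["[default]Motors/Axis1/Jog"], [False])
```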
I know this doesn't answer your mouse event question, but you should give this thread a read and evaluate exactly how your push-to-run logic is working. There is potential for danger here if done incorrectly.
Thank you @bmusson, that looks like A LOT of good reading; I had not come across that thread yet. I’ll dig into it.
I’m not too concerned about latching but it would be nice to ensure it doesn’t happen. My system is relatively small and the motors are small linear motion stepper motors with limit switches. So latching will be more of a nuisance than a danger.
I was about to cite the same thing. We moved to Phil’s logic recently, using a timer and an incrementing value in the PLC instead of a simple digital, as we had issues with the digital being held on with a noticeable delay after the button was released in SCADA.
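For anyone who hasn’t read that thread, the pattern is roughly this (a minimal sketch under assumed names; the heartbeat tag, the held flag, and the interval are all illustrative, and the PLC stops the motor whenever the value stops changing within its watchdog window):

```python
# Fired on press (onTouchStart/onMouseDown): keep bumping a heartbeat
# tag while the button is held; the PLC only runs the motor while the
# heartbeat keeps incrementing within its timeout.
def runAction(self, event):
    import time
    self.custom.held = True  # cleared by the release event's script

    def heartbeat():
        path = "[default]Motors/Axis1/JogHeartbeat"
        while self.custom.held:
            value = system.tag.readBlocking([path])[0].value
            system.tag.writeBlocking([path], [(value + 1) % 32768])
            time.sleep(0.1)

    system.util.invokeAsynchronous(heartbeat)
```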
There is no difference between the desired behaviors of the onMouseDown and onTouchStart events. I’ve tried to use just the onMouseDown event, but it does not always fire on the touch screen tablet. As the video shows, the onMouseDown event fires only randomly (maybe 1 out of 15 touches).
I took a look at the Mozilla link. Is it possible to utilize the preventDefault() function from within a Perspective script?
Looks like preventing the event’s default is a setting within the component events, but it’s not something that’s exposed in a UI to the user. Manually editing the JSON config doesn’t seem to pass the setting through, so that’ll be something that we need to fix on our end. I’ve updated the title with the ticket number, and will keep you posted when it gets fixed.
I just made a blank Perspective project with 3 buttons and 1 active label. My current version of Ignition is a nightly, 8.0.14 (b2020051802). I used this project to test the mouse up and down events using a touch screen.
There is a ‘Down’ button that only has an ‘onMouseDown’ script event:
self.getSibling("lbldown").props.text = "mouseDown"
There is an ‘Up’ button that only has an ‘onMouseUp’ script event:
self.getSibling("lbldown").props.text = "mouseUp"
Finally, the ‘Click’ button only has an ‘onClick’ script event:
self.getSibling("lbldown").props.text = "mouseClick"
On a Windows 10 tablet (Microsoft Surface Pro), I used Chrome (83.0.4103.61, 64-bit) and Firefox (77.0.1, 64-bit) to test the project and got the same results from both, as seen in the video below.
I first demonstrate the buttons with a mouse to show they work as expected. As you can see, when I use the touch screen the mouse down event seems to fire when I release the button, unless I hold the button too long, in which case it doesn’t fire at all. The exact same behavior is true of the mouse up event. The click event seems to work as I would expect.
Just tried Safari and Chrome on iOS and I see the same behavior.
After some investigation, I think your best bet here is to look at the user agent string/device type in the session props and fire the appropriate event as needed. We have added preventDefault and stopPropagation options to events; however, due to a limitation in React, there’s currently no way to allow preventDefault on passive event handlers (e.g. onTouchStart). Let me know if you have any other concerns regarding this issue.
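As a sketch of that suggestion, assuming the built-in session.props.device.type session prop, an onMouseDown script could simply skip its work on touch devices and let onTouchStart handle those sessions:

```python
# onMouseDown: only act when the session reports a non-touch device,
# letting the onTouchStart script handle touch sessions instead
def runAction(self, event):
    if self.session.props.device.type == "desktop":
        self.getSibling("lbldown").props.text = "mouseDown"
```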
Thanks for taking a look at this. I’ve spent a little bit of time this morning playing with the mouse/touch up/down events in Chrome. Within the dev tools, I entered monitorEvents($0) after inspecting a button element. This allowed me to monitor the events being fired, and it’s pretty obvious that the Chrome mouseup/mousedown events don’t always fire on a touch device.
I did notice, however, that pointer events seem to fire very consistently for both mouse and touch interfaces. While they appear to have been around for some time (Link), they’re a new concept to me.
Has there been any thought put into using the pointer events in Perspective? Should I put in a feature request?
Or… is there a way for me to gain access to these events from Perspective now so I can test it out?
I don’t believe there has been any discussion about putting pointer events into Perspective. It looks like pointer events have only been supported in very recent versions of some browsers. However, you are still encouraged to make a feature request.
There’s another suggestion I forgot to mention. Depending on your application, try putting a touch listener on a primary view early in the application to update some session prop (e.g. isTouchCapable). Then you can use this prop to prevent the mouse down event from firing.
We’ve seen similar issues. The primary use case of the views we are developing is on a touch screen, so we had used the onTouch events for the best performance, running the application in Chrome on the client. The issue that @dkylep is showing is something we saw when using onMouse events like down, up, or click. If you keep your finger on the screen long enough, it appears to be registered as a touch event and the mouse event is therefore ignored. What is odd is that even onMouseDown does this, which implies that when touching a button the system does not immediately recognize a mouse event and only considers it a mouse down if you release quickly.
A related issue comes into play if you’re working quickly: if your finger slides off the button before you release, the button will not fire, forcing the operator to be more precise with their touches. For this reason we went with onTouch events… but those have some pitfalls too.
If I am viewing the application from a desktop, I now need to enable Chrome’s developer-mode touch emulation to be able to use the buttons, which is not ideal. Furthermore, with the release of Workstation, we found that even on a touchscreen device Workstation would not recognize the onTouch events when touching the device. Instead, the onMouse events worked, but actually behaved like onTouch did in Chrome (i.e. acknowledging an input if I press on a button and slide off before releasing; onMouseDown is registered immediately, like you’d expect).
We really do not want to have the code duplicated in multiple actions, as that seems like a lot of waste and more things to change or update if ever needed. If there is a way to set up Chrome to register a touch like a click immediately, that may solve the problem. We found it odd that Workstation was different, though at the end of the day I would say the mouse events are preferred because they can be used on any device. Workstation should at least respond to the touch events though, I would think, right?
I’ve just run into this issue recently, where the touch interface will fire a MouseDown but not the MouseUp on momentary buttons. I’m attempting to implement touchStart and touchEnd but can’t find any details on how to detect whether the device is touch capable.
Can anyone shed any light on this? I’m trawling through the system scripting functions looking for something.
I took ayu’s advice. In my application I have an initial screen that acts as a navigation screen, so the user has to click (or touch) a navigation icon.
I have found that the ‘onTouchStart’ event always fires, so I created an onTouchStart event on the navigation icon. That event only fires if someone really does touch the navigation icon on the screen.
To record whether the client is touch capable, I created a session property called ‘isTouchCapable’ (default value false) and I set it to true when the navigation icon fires the onTouchStart event.
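Concretely, that works out to two small scripts (a sketch; isTouchCapable is the custom session property described above):

```python
# onTouchStart on the navigation icon: mark this session as touch capable
def runAction(self, event):
    self.session.custom.isTouchCapable = True

# onMouseDown on a momentary button elsewhere: ignore the synthetic
# mouse event on touch sessions and let onTouchStart handle it
def runAction(self, event):
    if not self.session.custom.isTouchCapable:
        self.getSibling("lblClick").props.text = "mouseDown"
```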
I ended up using the onTouchStart event and an onMouseClick event to fire a timer in a separate thread; this seems to be the best way to do this. The timer length can also be configured for variable delays as desired on individual embedded views.
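Something like the following, as a sketch of that approach (the tag path and the pulseMs view param are assumptions; the param is what lets each embedded view set its own delay):

```python
# Shared action fired from both onTouchStart and onMouseClick:
# hold the output for a configurable pulse, then drop it, on a
# separate thread so the event script returns immediately
def runAction(self, event):
    import time
    pulseMs = self.view.params.pulseMs  # per-embedded-view delay

    def pulse():
        system.tag.writeBlocking(["[default]Motors/Axis1/Run"], [True])
        time.sleep(pulseMs / 1000.0)
        system.tag.writeBlocking(["[default]Motors/Axis1/Run"], [False])

    system.util.invokeAsynchronous(pulse)
```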
To reliably set your isTouchCapable property, though, wouldn’t you need to include an onTouchStart event on all views and interactable objects within them?
Thank you for this information about pointer events.
We solved our issue (i.e., consistency between mouse clicks on desktop and touches on mobile devices) using three events: onPointerDown, onPointerUp, and onPointerOut.
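For anyone landing here later, the wiring is simple (a sketch; the tag path is hypothetical): onPointerDown starts the move, and both onPointerUp and onPointerOut stop it, so lifting the finger or sliding off the button always stops the motor.

```python
# onPointerDown: start motion
system.tag.writeAsync(["[default]Motors/Axis1/Run"], [True])

# onPointerUp and onPointerOut: stop motion (duplicate the script on
# both events so sliding off the button behaves like releasing it)
system.tag.writeAsync(["[default]Motors/Axis1/Run"], [False])
```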