12-01-2017 12:30
I'm developing a watchface that has touchable elements down the right-hand side. The elements display fine, and in general they respond to onTouch(). However, touch events in the top-right and bottom-right corners of the screen don't seem to get passed through to my code (all others do).
I'm wondering whether this is an unintended consequence of the physical buttons being enabled for watch faces: the system code for that may be treating the screen corners as though Combo Buttons were positioned there to correspond to the physical buttons (even though corner touches don't actually do anything). The events would then be swallowed by the system rather than being passed through to user code.
If so, I'd suggest that the system should not intercept RHS corner touches unless they are actually intended to have the same effect as pressing the corresponding physical button. Otherwise, the touches should be passed through to user code.
I haven't investigated what happens for non-watchface apps.
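To make the theory concrete, here's a rough sketch (plain JavaScript, not SDK code) of the kind of hit-test the system might be doing. The zone dimensions are made up for illustration; only the 348x250 screen size matches the Ionic.

```javascript
// Illustrative only: zone dimensions are guesses, not firmware values.
const SCREEN_W = 348;
const SCREEN_H = 250;
const ZONE_W = 60;   // hypothetical width of a combo-button corner zone
const ZONE_H = 60;   // hypothetical height of a combo-button corner zone

// Returns true if a touch at (x, y) falls in one of the right-hand
// corner zones that (per the theory) the system reserves for the
// physical buttons and never forwards to user code.
function inComboButtonZone(x, y) {
  const onRight = x >= SCREEN_W - ZONE_W;
  const inTop = y < ZONE_H;
  const inBottom = y >= SCREEN_H - ZONE_H;
  return onRight && (inTop || inBottom);
}

console.log(inComboButtonZone(330, 10));   // true: top-right corner, swallowed
console.log(inComboButtonZone(330, 125));  // false: mid right edge, passed through
```

If something like this is happening unconditionally, it would explain why only the corner elements are dead while the rest of the right-hand side works.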
12-03-2017 00:07
I've changed my theory. The whole of the bottom of the screen (and probably the whole of the top) seems to be unresponsive to onClick(). This may be because the system swallows those events in case they turn out to be the start of a drag up or down to access music or notifications. It would be good if the events could be passed through to watchface code when a drag isn't actually detected. (This would need some tolerance, since a touch is rarely completely static.)
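The tolerance idea could look something like this (plain JavaScript sketch; the threshold value and function names are my own, not anything from the SDK):

```javascript
// Illustrative only: the threshold is an assumption, not an SDK value.
const DRAG_TOLERANCE_PX = 15; // allow a little finger movement during a tap

// Given where a touch started and ended, decide whether the system
// should treat it as a drag (consumed for the music/notification
// panels) or a tap (passed through to the watchface).
function classifyTouch(startX, startY, endX, endY) {
  const dx = endX - startX;
  const dy = endY - startY;
  const moved = Math.sqrt(dx * dx + dy * dy);
  return moved > DRAG_TOLERANCE_PX ? "drag" : "tap";
}

console.log(classifyTouch(100, 240, 102, 238)); // "tap": tiny wobble only
console.log(classifyTouch(100, 240, 100, 120)); // "drag": big upward swipe
```

With something like this, a touch near the bottom edge that barely moves would still reach the watchface, while real swipes would keep opening the system panels.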