08-21-2021, 07:55 PM
(08-21-2021, 04:17 AM)Zubersoft Wrote: I'd have to test to know for certain, but listening for a focus event wouldn't really make sense, because you have to know which component/view on the screen you are attaching the onFocusChanged listener to. Focus can change if you tap in the search text field in the filters, for example, or if you trigger a dropdown, or tap other components on the screen. It definitely would not make sense to attach an onFocusChanged listener to every component in the entire application and it would not make sense to attach it to the parent view of the entire hierarchy because the children will most likely consume the event and not pass it back to their parent.
Please do test this! I think what you say may apply to onFocusChanged in general, but not when it's specified as focus_forward or focus_backward.
Focusing, by the way, doesn't seem to have anything to do with touching or tapping: when I use the 'Next' (focus forward) or 'Previous' (focus backward) action, the selectable items on the display get a focus border drawn around them, which doesn't happen when I tap or touch elements:
In short: if MobileSheets maps focus_forward or focus_backward to a pedal action, it shouldn't really interfere with other uses of MobileSheets, right?
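As a minimal sketch of what I mean (my own illustration, not MobileSheets code, and assuming Camera Switches' Next/Previous actually moves view focus rather than only accessibility focus, which is exactly what would need testing): Android lets you watch focus changes for a whole window with a single ViewTreeObserver.OnGlobalFocusChangeListener on the decor view, so no listener has to be attached to every individual component:

[code]
import android.app.Activity
import android.os.Bundle
import android.util.Log
import android.view.View
import android.view.ViewTreeObserver

// Hypothetical probe activity, not MobileSheets' actual code.
class FocusProbeActivity : Activity() {

    private val focusListener =
        ViewTreeObserver.OnGlobalFocusChangeListener { oldFocus: View?, newFocus: View? ->
            // Fires for any focus move inside this window, e.g. when a
            // focus_forward / focus_backward action lands on the next view.
            Log.d("FocusProbe", "focus moved: $oldFocus -> $newFocus")
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // A single listener on the decor view's ViewTreeObserver covers every child view.
        window.decorView.viewTreeObserver.addOnGlobalFocusChangeListener(focusListener)
    }

    override fun onDestroy() {
        window.decorView.viewTreeObserver.removeOnGlobalFocusChangeListener(focusListener)
        super.onDestroy()
    }
}
[/code]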
(08-21-2021, 04:17 AM)Zubersoft Wrote: What does it mean to trigger "Overview" or "Quick Settings"? What does it mean to trigger "Reverse auto-scan"?
The 'Overview' action simply does the same thing as tapping the square button in Android's navigation bar: it shows the apps that are currently open.
'Quick Settings' pulls down the notification shade with the quick settings tiles.
'Auto-scan' and 'Reverse auto-scan' automatically step through the selectable elements in the current view, highlighting them one by one, forward or in reverse.
I'm not sure about the other things you mention.
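For what it's worth, 'Overview' and 'Quick Settings' correspond to global actions that any accessibility service is allowed to trigger. A small sketch of my own (not Camera Switches' actual code; the class name is made up) just to show what those two actions amount to:

[code]
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityEvent

// Illustration only: how an accessibility service (which is presumably what
// Camera Switches / Switch Access is) can trigger these system-level actions.
class SwitchLikeService : AccessibilityService() {

    fun openOverview() {
        // Same effect as tapping the square (recents) button in the navigation bar.
        performGlobalAction(AccessibilityService.GLOBAL_ACTION_RECENTS)
    }

    fun openQuickSettings() {
        // Pulls down the notification shade with the quick settings tiles.
        performGlobalAction(AccessibilityService.GLOBAL_ACTION_QUICK_SETTINGS)
    }

    // Required overrides for any accessibility service; unused in this sketch.
    override fun onAccessibilityEvent(event: AccessibilityEvent?) {}
    override fun onInterrupt() {}
}
[/code]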
(08-21-2021, 04:17 AM)Zubersoft Wrote: Honestly, what I need is not to try to react to the camera switches app triggering actions, I just need the code they used to detect facial gestures. Then I can add functionality to trigger anything in MobileSheets without having to rely on an external application.
I understand that could be the best solution (and it would work across platforms), but it would indeed take a lot of time and effort to implement. Why not, at least for now, try to support Android's own camera switches, since they are included not in some shady app but in the official Android Accessibility Suite that already ships on most Android devices? I'm not asking MobileSheets to react to the camera switches directly; I'm asking for at least one extra Android action (besides the already working 'Back' action) that a camera switch can trigger to also be detectable by MobileSheets, for use as a pedal action. This could be very helpful for users of a Windows (or non-camera Android) device as well: if they have an Android phone, they can use 'Connect Tablets' to let the camera switches on their phone turn pages on their sheet music device.
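I don't know how MobileSheets detects the 'Back' pedal action internally, but if it's done by intercepting the key event in the activity, then supporting one extra action might look roughly like the sketch below. Everything here is hypothetical on my part (the class name, the page-turn methods, and especially which key code a camera switch could be mapped to):

[code]
import android.app.Activity
import android.view.KeyEvent

// Hypothetical sketch: class name, page-turn methods and key codes are all assumptions.
class SongDisplayActivity : Activity() {

    override fun dispatchKeyEvent(event: KeyEvent): Boolean {
        if (event.action == KeyEvent.ACTION_DOWN) {
            when (event.keyCode) {
                // 'Back' is the one action that already works as a pedal trigger today.
                KeyEvent.KEYCODE_BACK -> {
                    turnPageBackward()
                    return true // consume it so the app doesn't actually navigate back
                }
                // A hypothetical second action a camera switch might be mapped to.
                KeyEvent.KEYCODE_DPAD_DOWN -> {
                    turnPageForward()
                    return true
                }
            }
        }
        return super.dispatchKeyEvent(event)
    }

    private fun turnPageForward() { /* hypothetical page-turn call */ }
    private fun turnPageBackward() { /* hypothetical page-turn call */ }
}
[/code]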
(08-21-2021, 04:17 AM)Zubersoft Wrote: The other thing that is important to consider is that the camera switches app is going to be draining your power in the background, and it could be a significant power drain. It will also sit there detecting facial gestures when you aren't looking at any songs, or when the app is idle, which are both suboptimal.
It doesn't seem to drain more power than any other Bluetooth pedal. Besides, it doesn't really run in the background unnoticed: you'll see a blue or red face icon whenever it's active (telling you whether your face is detected or not), and you can easily set up a shortcut to turn it on or off (such as pressing and holding both volume buttons simultaneously):