[Android] Support for Android's new camera gesture control
#1
Shocked 
As can be read here, Android 11 and 12 support so-called Camera Switches (gesture controls that use the front camera of your smartphone or tablet). I already tried this functionality by installing the beta APK on my Oppo Reno 10x Zoom (Android 11), and it works wonderfully! It opens up great possibilities for MobileSheets: I can now move to the next page of a song by raising my eyebrows or by opening my mouth. The camera detects this extremely well (and you can even tweak the threshold settings to make it better still).

The big advantage of this method is that we no longer need a touch screen or a Bluetooth device. We could just connect a regular Android smartphone (which only needs to support DP Alt Mode) to a nice anti-glare external display, such as a 17.3-inch portable monitor (using just one USB-C cable) or a cheap ~100-dollar 24-inch monitor (separately powered by a PD power bank).

There's still one catch, though: the Camera Switches can be tied to one of several Android actions, but only one of those actions is compatible with MobileSheets pedal actions, namely 'Back'. At least one of the other actions (like 'Next' or 'Previous') should work as a pedal trigger as well... Could you please make this happen, Mike?

[Image: Camera-Switches-Open-Mouth-Action-498x1024.jpg][Image: Camera-Switches-Raise-Eyebrow-action-498x1024.jpg]
#2
I'd definitely like to add support for those other actions, but I'm going to need to find some documentation on how those actions are triggered and what events my app needs to listen to in order to respond to them. If they aren't coming in as key commands, I would need to either extend the UI for pedal actions to somehow allow those actions to be selected, or add a different part of the settings where you can assign actions in MobileSheets to the actions triggered by the face gestures. There are other APIs for detecting face gestures that will work on older Android devices, but that will probably be more complicated to implement, as I would have to write the gesture detection code myself. I'd then also need to support both the face gesture detection built into MobileSheets and the gesture detection available in Android itself on Android 11 or higher.
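For context, if the camera-switch actions *did* arrive as key commands, they would slot straight into the existing pedal handling. A rough illustration of that idea (a hypothetical sketch, not MobileSheets' actual code; the key-code values are copied from android.view.KeyEvent so the mapping logic can be shown without the Android framework):

```java
// Hypothetical sketch: pedal presses normally arrive as Android key
// events, so an Activity can map key codes to page-turn actions from
// its onKeyDown(keyCode, event) override. The constants below mirror
// android.view.KeyEvent values for illustration only.
public class PedalKeyMapper {
    // Values copied from android.view.KeyEvent.
    public static final int KEYCODE_PAGE_UP = 92;
    public static final int KEYCODE_PAGE_DOWN = 93;

    public enum Action { PREVIOUS_PAGE, NEXT_PAGE, NONE }

    // In a real Activity this would be called from onKeyDown.
    public static Action map(int keyCode) {
        switch (keyCode) {
            case KEYCODE_PAGE_UP:   return Action.PREVIOUS_PAGE;
            case KEYCODE_PAGE_DOWN: return Action.NEXT_PAGE;
            default:                return Action.NONE;
        }
    }
}
```

The catch described above is exactly that the camera-switch actions (other than Back) do not generate key events like these, so this path never fires.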

Mike
#3
I would say the action 'Home' should be similar to 'Back' (which MobileSheets already detects), or am I wrong?
Ideally, the 'Previous' and 'Next' actions should be detected (so that the Back and Home buttons keep their own functionality while using MobileSheets).
But I don't know of any documentation for selecting the next or previous element in Android...

Please let me know when you're making any progress on this!
I'm already dreaming of using this technique with a 25-inch e-ink monitor (in two-page mode)...
#4
@Mike: you mention other APIs that will work on older Android devices, but I've found out that the Camera Switches APK works on older Android versions as well! APKMirror lists it for Android 6 and newer, and I just tested it on an HTC 10 with Android 10, where it works great! That means this 'camera pedal' will work on most Android phones and tablets featuring a front camera. All the more reason to make another action (besides the already supported 'Back' action) detectable by MobileSheets as soon as possible Angel When do you think this can be done?
#5
The back button trigger fires an event through the Android framework that my app simply listens to (the onBackPressed method that every Activity triggers). There is no equivalent for the Home button, or for previous or next. I can't provide any kind of estimate for the level of effort involved here without knowing how to listen for those. If they came in as key press events, then MobileSheetsPro would already support them through the way I handle pedal actions.

It's great that the Camera Switches APK works on older Android versions, but unless I have access to the source code for that application, that doesn't really help me understand what exactly needs to be implemented. I need documentation on what libraries have to be integrated, what events have to be listened to, and so on. Just knowing that there is another application out there that supports these things won't bring me any closer to implementing it myself unless there is source code I can look at, or an existing library that is well documented. In other forum posts about this topic, I pointed to libraries that Google makes available for processing facial gestures, but they don't provide the kind of functionality shown in the APK you are using. So I'm going to need additional information before I can move forward with anything.

Thanks,
Mike
#6
Thanks for your reply! Well, the Camera Switches functionality (included in the latest beta of the Android Accessibility Suite) works with several Android events, including the Back event. So instead of just listening to onBackPressed, MobileSheets could, for example, also listen to onFocusChanged, because the 'Previous' and 'Next' actions (which move focus to the previous or next selectable item) can be assigned to Camera Switches. After reading the documentation for Android developers, I think the correct events to listen to in this case are onFocusChanged(true, FOCUS_BACKWARD, null) and onFocusChanged(true, FOCUS_FORWARD, null). Does this make any sense?
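To make the suggestion concrete, here is a minimal sketch of the mapping it implies (hypothetical and untested; the constants are copied from android.view.View so the logic can be shown without the framework, and whether such a callback would actually fire for a song view is a separate question):

```java
// Hypothetical sketch: in a View subclass, this logic would live in
// onFocusChanged(gainFocus, direction, previouslyFocusedRect). The
// direction constants mirror android.view.View.FOCUS_BACKWARD and
// View.FOCUS_FORWARD for illustration only.
public class FocusDirectionMapper {
    public static final int FOCUS_BACKWARD = 0x00000001; // View.FOCUS_BACKWARD
    public static final int FOCUS_FORWARD  = 0x00000002; // View.FOCUS_FORWARD

    public enum Action { PREVIOUS_PAGE, NEXT_PAGE, NONE }

    // Map a focus-gain event's direction to a page-turn action.
    public static Action map(boolean gainFocus, int direction) {
        if (!gainFocus) return Action.NONE;
        if (direction == FOCUS_BACKWARD) return Action.PREVIOUS_PAGE;
        if (direction == FOCUS_FORWARD)  return Action.NEXT_PAGE;
        return Action.NONE;
    }
}
```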
#7
I'd have to test to know for certain, but listening for a focus event wouldn't really make sense, because you have to know which component/view on the screen you are attaching the onFocusChanged listener to. Focus can change if you tap in the search text field in the filters, for example, or if you trigger a dropdown, or tap other components on the screen. It definitely would not make sense to attach an onFocusChanged listener to every component in the entire application and it would not make sense to attach it to the parent view of the entire hierarchy because the children will most likely consume the event and not pass it back to their parent. There are next/previous IME events (https://www.geeksforgeeks.org/how-to-han...n-android/ for example), but those only make sense if an EditText control has focus and the user is interacting with the virtual keyboard. If the camera switches app is generating IME events when a text field doesn't have focus, that wouldn't make a whole lot of sense.

Looking at the list of actions that can be triggered, they seem to be all over the place:
  • Pause Camera Switch
  • Toggle auto-scan (disabled)
  • Reverse auto-scan
  • Select
  • Next
  • Previous
  • Touch & hold
  • Scroll forward
  • Scroll backward
  • Home
  • Back
  • Notifications
  • Quick Settings
  • Overview
What does it mean to trigger "Overview" or "Quick Settings"? What does it mean to trigger "Reverse auto-scan"? What does it mean to trigger Touch & hold when you aren't actually touching the screen? Is the idea that you trigger focus changes using accessibility features, and once you've put focus on certain things, then you trigger the face gesture? That would suddenly make a lot more sense, as you could shift focus to a button, then trigger the "Touch & hold", or if you are actively entering text, you could be using speech-to-text, then use the face gesture to trigger "Next" to go to the next text field. If that's how these are intended to be used, it means they aren't really for general use. Applications can't listen specifically for these events, as they are triggering functionality the same way a finger would.

Does "Scroll forward" trigger a page down? What mechanism is it using to scroll forward otherwise? If it's triggering page down, then it could be used with the pedal actions. If it's doing something else, it's probably using something in the Android framework to grab the control that currently has focus and force it to scroll some amount downward, which would be impossible for my app to detect.

Honestly, what I need is not to try to react to the camera switches app triggering actions, I just need the code they used to detect facial gestures. Then I can add functionality to trigger anything in MobileSheets without having to rely on an external application.

The other thing that is important to consider is that the camera switches app is going to be draining your power in the background, and it could be a significant power drain. It will also sit there detecting facial gestures when you aren't looking at any songs, or when the app is idle, which are both suboptimal. I still think it's going to be much better for me to add the necessary changes to detect facial gestures on my own, using ML Kit:

https://developers.google.com/android/re.../face/Face
https://www.journaldev.com/15629/android-face-detection

For some simple gestures, it should work pretty well, I think. Based on what I'm seeing, I can probably implement all of the same gestures detected by the camera switches app. Of course, there will be a fair amount of effort involved. If someone is a programmer and wants a side project, I'd be happy to work with them to implement all of the necessary code: detecting the various gestures, triggering callbacks that I can listen to, and easily starting/stopping the detection as needed. Otherwise I will add this to the list of things to work on after the iOS version is released.
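As a rough sketch of what the ML Kit route could look like: the per-frame classification step might resemble the following (hypothetical names and thresholds; note that ML Kit's built-in classifier only reports smiling and eye-open probabilities, so mouth-open or eyebrow-raise gestures would have to be derived from the face landmark/contour data instead):

```java
// Hypothetical sketch of a per-frame gesture classifier fed by ML Kit's
// face detector (Face.getSmilingProbability(),
// Face.getLeftEyeOpenProbability(), Face.getRightEyeOpenProbability()).
// Thresholds are made up; a real implementation would also debounce
// across several frames so one gesture triggers one page turn.
public class GestureClassifier {
    public enum Gesture { SMILE, BLINK, NONE }

    private static final float SMILE_THRESHOLD = 0.8f;
    private static final float EYE_CLOSED_THRESHOLD = 0.2f;

    public static Gesture classify(float smileProb,
                                   float leftEyeOpenProb,
                                   float rightEyeOpenProb) {
        if (smileProb > SMILE_THRESHOLD) {
            return Gesture.SMILE;
        }
        if (leftEyeOpenProb < EYE_CLOSED_THRESHOLD
                && rightEyeOpenProb < EYE_CLOSED_THRESHOLD) {
            return Gesture.BLINK;
        }
        return Gesture.NONE;
    }
}
```

The real effort would be in the camera pipeline and in making the thresholds user-tunable, much like the Camera Switches settings described earlier in the thread.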

Mike
#8
(08-21-2021, 04:17 AM)Zubersoft Wrote: I'd have to test to know for certain, but listening for a focus event wouldn't really make sense, because you have to know which component/view on the screen you are attaching the onFocusChanged listener to. Focus can change if you tap in the search text field in the filters, for example, or if you trigger a dropdown, or tap other components on the screen. It definitely would not make sense to attach an onFocusChanged listener to every component in the entire application and it would not make sense to attach it to the parent view of the entire hierarchy because the children will most likely consume the event and not pass it back to their parent.


Please do test this! Because I think what you say may hold for onFocusChanged in general, but not when it's specified as FOCUS_FORWARD or FOCUS_BACKWARD.
By the way, focusing doesn't seem to have anything to do with touching or tapping: when I use the 'Next' (focus forward) or 'Previous' (focus backward) action, the selectable items on the display get a focus border around them, which doesn't happen when tapping or touching elements:

[Image: FPa1w7X.png]


In short: if MobileSheets maps FOCUS_FORWARD or FOCUS_BACKWARD to a pedal action, it shouldn't really interfere with other uses of MobileSheets, right?


(08-21-2021, 04:17 AM)Zubersoft Wrote: What does it mean to trigger "Overview" or "Quick Settings"? What does it mean to trigger "Reverse auto-scan"?


The 'Overview' action simply does the same thing as tapping the square button on Android's navigation bar: it shows the apps that are currently open.
'Quick Settings' pulls down the notification bar with the quick settings.
'Auto-scan' and 'Reverse auto-scan' scan the current view for selectable elements.
I'm not sure about the other things you mention.


(08-21-2021, 04:17 AM)Zubersoft Wrote: Honestly, what I need is not to try to react to the camera switches app triggering actions, I just need the code they used to detect facial gestures. Then I can add functionality to trigger anything in MobileSheets without having to rely on an external application. 


I understand this could be the best approach (and cross-platform to boot), but it would indeed take a lot of time and effort to implement. Why not, at least for now, try to support Android's own camera switches, since they will be included not in some shady app but in the official Android Accessibility Suite that's already present on most Android devices? I'm not asking MobileSheets to react to the camera switches themselves; I'm asking for at least one extra Android action (besides the already working 'Back' action) that can be triggered by a camera switch to also be detectable by MobileSheets (for use as a pedal action). This could be very helpful for users of a Windows (or camera-less Android) device as well: if they have an Android phone, they can use 'Connect Tablets' to use the camera switches on their phone to turn pages on their sheet music device.


(08-21-2021, 04:17 AM)Zubersoft Wrote: The other thing that is important to consider is that the camera switches app is going to be draining your power in the background, and it could be a significant power drain. It will also sit there detecting facial gestures when you aren't looking at any songs, or when the app is idle, which are both suboptimal.


It doesn't seem to drain more power than any other Bluetooth pedal. Besides, it doesn't really run in the background: you'll see a blue or red face icon when it's activated (telling you whether your face is detected), and you can easily set up a shortcut to turn it on or off (like pressing and holding both volume buttons simultaneously):

[Image: QphKsYz.jpg]
#9
You still have to have a callback registered to receive any kind of focus notification, regardless of the focus-next or focus-previous designator, and there isn't a general high-level callback for that that I'm aware of. You mentioned wanting MobileSheetsPro to listen for "focus_next" or "focus_previous" pedal actions, but those aren't key commands that I can listen for. They trigger OS-level focus adjustments by looking at which component currently has focus and requesting that the next focusable component on the screen receive focus. If you are viewing a song, nothing technically has focus, and even if something did, I wouldn't want the focus being changed in the app in an invisible manner. All of the actions triggered by camera switches result in undesirable behavior that I wouldn't be able to stop, other than the back button, which has a special callback. I really don't think it's worth expending additional effort researching and testing whether any of those would possibly work, especially as I'm trying to get the iOS version finished. If someone else with Android experience wants to run tests to see if this can be hooked into in a manner that won't conflict with the normal usage of the app, that would be great. Otherwise I would rather wait until I have time to work on incorporating the facial gestures in MobileSheetsPro itself, rather than relying on something external that doesn't really offer the functionality I want.

Thanks,
Mike
#10
Thanks for thinking this through. I'll patiently wait for this functionality, and until then use Camera Switches with the Back button only. forScore already has face gestures built in, so that could be another reason to implement it, given that you will have to compete with them on iOS. Good luck with all your efforts!
#11
That is true, although the feature in forScore is only available to Pro users who pay the subscription pricing. By contrast, if I add this feature, everyone will have access to it without having to pay extra.

Mike
Reply
#12
Yes, forScore would have to fear for its future Tongue

Another benefit of implementing face gestures in MobileSheets itself would be support for older Android devices, like the 18.4-inch Samsung Galaxy View running Android 5 (the Accessibility Suite requires Android 6+).

