I'd have to test to know for certain, but listening for a focus event wouldn't really make sense, because you have to know which component/view on the screen you're attaching the OnFocusChangeListener to (there's a small sketch after the list below). Focus can change if you tap the search text field in the filters, for example, or if you trigger a dropdown, or tap other components on the screen. It definitely would not make sense to attach a focus listener to every component in the entire application, and it would not make sense to attach one to the parent view of the whole hierarchy, because the children will most likely consume the event and not pass it back to their parent. There are next/previous IME events (https://www.geeksforgeeks.org/how-to-han...n-android/ for example), but those only make sense when an EditText control has focus and the user is interacting with the virtual keyboard. If the Camera Switches app generated IME events while no text field had focus, that wouldn't make a whole lot of sense. Looking at the list of actions it can trigger, they seem to be all over the place:
- Pause Camera Switch
- Toggle auto-scan (disabled)
- Reverse auto-scan
- Select
- Next
- Previous
- Touch & hold
- Scroll forward
- Scroll backward
- Home
- Back
- Notifications
- Quick Settings
- Overview
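Just to illustrate the focus point above, here's roughly what a focus listener looks like in Kotlin. The searchField parameter is a placeholder for one specific view; you'd need a reference like this for every single component you wanted to watch:

    import android.view.View

    // A focus listener only fires for the one view it is attached to.
    // "searchField" is a placeholder for a specific view reference --
    // every component you want to watch needs its own listener.
    fun watchSearchField(searchField: View) {
        searchField.setOnFocusChangeListener { _, hasFocus ->
            if (hasFocus) {
                // the search text field just gained focus
            } else {
                // it just lost focus
            }
        }
    }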
The other thing that is important to consider is that the Camera Switches app runs in the background, and its constant camera processing could be a significant power drain. It will also sit there detecting facial gestures when you aren't looking at any songs, or when the app is idle, both of which are suboptimal. I still think it's going to be much better for me to add the necessary changes to detect facial gestures on my own, using ML Kit:
https://developers.google.com/android/re.../face/Face
https://www.journaldev.com/15629/android-face-detection
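As a rough idea of what this would look like (the function name and thresholds are mine, and the thresholds would definitely need tuning), ML Kit's face detector reports per-face classification probabilities and head rotation angles that should be enough for simple gestures:

    import com.google.mlkit.vision.common.InputImage
    import com.google.mlkit.vision.face.Face
    import com.google.mlkit.vision.face.FaceDetection
    import com.google.mlkit.vision.face.FaceDetectorOptions
    import kotlin.math.abs

    // Ask ML Kit for classification probabilities (smile, eyes open).
    val options = FaceDetectorOptions.Builder()
        .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_FAST)
        .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)
        .build()

    val detector = FaceDetection.getClient(options)

    // Called once per camera frame; the 0.8f / 30-degree thresholds are guesses.
    fun processFrame(image: InputImage) {
        detector.process(image)
            .addOnSuccessListener { faces: List<Face> ->
                val face = faces.firstOrNull() ?: return@addOnSuccessListener
                val smiling = (face.smilingProbability ?: 0f) > 0.8f
                val headTurned = abs(face.headEulerAngleY) > 30f
                if (smiling) { /* fire a "smile" callback */ }
                if (headTurned) { /* fire a "look left/right" callback */ }
            }
            .addOnFailureListener { /* log and skip this frame */ }
    }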
For some simple gestures, I think it should work pretty well. Based on what I'm seeing, I can probably implement all of the same gestures the Camera Switches app detects, though there will be a fair amount of effort involved. If someone is a programmer and wants a side project, I'd be happy to work with them on implementing the necessary code: detect the various gestures, trigger callbacks that I can listen to, and make it easy to start/stop the detection as needed (roughly the shape sketched below). Otherwise I will add this to the list of things to work on after the iOS version is released.
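To give a sense of the shape I have in mind (all of these names are hypothetical, nothing here exists yet), the detection code would hide ML Kit behind a small start/stop wrapper that fires gesture callbacks:

    import com.google.mlkit.vision.face.FaceDetection
    import com.google.mlkit.vision.face.FaceDetector
    import com.google.mlkit.vision.face.FaceDetectorOptions

    // Hypothetical gesture set, mirroring what Camera Switches detects.
    enum class FacialGesture { SMILE, OPEN_MOUTH, RAISE_EYEBROWS, LOOK_LEFT, LOOK_RIGHT, LOOK_UP }

    // Hypothetical callback the rest of the app would listen to.
    fun interface FacialGestureListener {
        fun onGesture(gesture: FacialGesture)
    }

    class FacialGestureDetector(private val listener: FacialGestureListener) {
        private var detector: FaceDetector? = null

        // Start detection only when a song is actually on screen.
        fun start() {
            if (detector != null) return
            val options = FaceDetectorOptions.Builder()
                .setClassificationMode(FaceDetectorOptions.CLASSIFICATION_MODE_ALL)
                .build()
            detector = FaceDetection.getClient(options)
            // Camera frames would be routed into the detector from here on;
            // when a frame matches a gesture, call listener.onGesture(...).
        }

        // Stop as soon as the app goes idle, so the camera isn't burning power.
        fun stop() {
            detector?.close()
            detector = null
        }
    }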
Mike