
Android 12 improves gesture navigation with machine learning

Since Android 9.0 Pie, Google has made significant changes to the way you navigate the Android operating system. Starting with Android 12, Google even applies a limited form of machine learning to adapt gesture navigation to the way someone uses their phone.

Gesture navigation in Android 12

Since gesture navigation was introduced in Android to replace the old software buttons, complaints about how the navigation method works have kept pouring in. In many apps, regular in-app navigation actions are interpreted as system navigation gestures, so you are suddenly kicked out of the app or inadvertently sent back to the previous page. Android 10 introduced a solution: developers could manually define exclusion zones.

Within such an exclusion zone, system gesture navigation is blocked. Android also gained sensitivity settings, which let you set how quickly the system responds to a navigation gesture. For Android 12, Google is working on tailoring gesture navigation to how users actually use it, developer Quinny899 reports on XDA-Developers. He found two of his own apps in a list of 43,000 apps whose navigation behavior the new operating system monitors.
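For context, this is roughly what the Android 10 exclusion-zone mechanism looks like from an app developer's side. It is a minimal, hypothetical sketch (the slider view and its use case are our own example, not something from the report): the app asks the system not to treat swipes inside a view's bounds as back gestures.

```kotlin
import android.graphics.Rect
import android.os.Build
import android.view.View

// Hypothetical example: mark a draggable slider's bounds as a gesture
// exclusion zone, so a horizontal swipe on it is not interpreted as a
// system back gesture (available since Android 10, API 29).
fun excludeFromSystemGestures(slider: View) {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
        slider.addOnLayoutChangeListener { view, left, top, right, bottom, _, _, _, _ ->
            // Exclusion rects are given in the view's own coordinate space.
            view.systemGestureExclusionRects =
                listOf(Rect(0, 0, right - left, bottom - top))
        }
    }
}
```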

Current navigation customization options in Android 12. Image: Android Police.

Google uses a TensorFlow Lite model for this, which allows the machine learning to run on the phone itself. According to Quinny899, the EdgeBackGestureHandler, the component that handles the back gesture in Android 12, refers to a file in which data from back gestures is kept. With a machine learning model it is possible to recognize certain behavior and adjust gesture navigation based on the model's output.
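To give an idea of what on-device TensorFlow Lite inference looks like, here is a purely illustrative sketch. The model file, the gesture features and the meaning of the output are assumptions for the sake of the example; the report only says that EdgeBackGestureHandler references a TensorFlow Lite model fed with back-gesture data.

```kotlin
import org.tensorflow.lite.Interpreter
import java.io.File

// Illustrative sketch: feed a few swipe features to a TFLite model and
// interpret the output as "this swipe was meant as a back gesture".
// Feature choice and threshold are assumptions, not Google's implementation.
fun isIntentionalBackGesture(modelFile: File, startX: Float, endX: Float, velocity: Float): Boolean {
    val interpreter = Interpreter(modelFile)
    val input = arrayOf(floatArrayOf(startX, endX, velocity))  // shape [1, 3]
    val output = Array(1) { FloatArray(1) }                    // shape [1, 1]
    interpreter.run(input, output)
    interpreter.close()
    return output[0][0] > 0.5f  // threshold chosen arbitrarily here
}
```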

Fast gesture navigation actions

Google has made another change to gesture navigation in Android 12, as described by Android Police. In Android 12, a gesture to return to the previous screen or the home screen works from full-screen mode in one go. In Android 11 you first have to tap the screen once and then perform the navigation gesture. The change does seem to require an adjustment on the app side: it does not work in Twitter, for example.
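"Full screen" here refers to immersive mode, where an app hides the status and navigation bars. The sketch below shows a common way an app enters that mode; it is only context for the behavior described above, since the article does not detail what app-side adjustment the one-swipe gesture requires.

```kotlin
import androidx.appcompat.app.AppCompatActivity
import androidx.core.view.WindowCompat
import androidx.core.view.WindowInsetsCompat
import androidx.core.view.WindowInsetsControllerCompat

// Hypothetical sketch of an immersive full-screen setup.
fun AppCompatActivity.enterImmersiveMode() {
    WindowCompat.setDecorFitsSystemWindows(window, false)
    val controller = WindowInsetsControllerCompat(window, window.decorView)
    controller.hide(WindowInsetsCompat.Type.systemBars())
    // In this "sticky" mode a swipe from the edge temporarily reveals the bars;
    // on Android 11 a back gesture therefore needed that extra swipe first.
    controller.systemBarsBehavior =
        WindowInsetsControllerCompat.BEHAVIOR_SHOW_TRANSIENT_BARS_BY_SWIPE
}
```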

You can expect the latter gesture change to actually make it into the stable version of Android 12. Whether Google will also ship the machine learning model in the stable release, which will probably launch at the end of the third quarter, is a different story. Currently, a flag has to be changed in Android 12 to enable the machine learning: the changes are therefore not noticeable by default.


Do you hope Google keeps developing the machine learning features in Android 12, or are you satisfied with how gesture navigation works on your phone now? Let us know in the comments, and don't forget to mention which Android version you're using.
