Random Musings on Q Beta 3
Each time Google releases a new developer preview beta, I putter around the API differences report, the high-level overviews, and even the release blog post, to see if there are things that developers should pay more attention to. I try to emphasize mainstream features that any developer might reasonably use, along with things that may not get quite as much attention because they are buried in the JavaDocs.
So, while keeping one eye on the Google I|O keynotes and stuff, I took a look at Beta 3.
So, What’s Up With Scoped Storage?
I’ll blog more about this later in the week, as I’m still sifting through what changed.
The next-generation gesture-based navigation scares me, as it does not work the way that I had been expecting. I thought that this was just going to be an extension of the 9.0 “pill”, where a particular swipe on the pill would represent back navigation. Instead, back navigation comes from an edge swipe, from either edge of the screen, for the full screen height. And, if last year is any indication, all Pixel 4 users may be forced to use this system.
If your app uses horizontal swipe gestures, this is going to suck for you and your users. Now, whether your gesture or the system back gesture gets triggered will be based on fairly subtle distinctions of where the finger started the gesture. You should start thinking about what you are going to do, should Google go ahead with this plan. Do you:
Eliminate your horizontal swipe gestures?
Do something to visually distinguish where your horizontal swipe gesture works, to help distinguish it from the invisible edge swipe area?
Override onBackPressed() and prompt the user “did you really want to leave this screen?”, if leaving the screen accidentally might make the user unhappy?
Leave it alone and hope for the best?
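For the onBackPressed() option, a minimal sketch might look like the following, inside some Activity subclass. Note that shouldConfirmExit() is a hypothetical check for unsaved work or similar state, and the button wording is just a placeholder:

```java
// A sketch: intercept back navigation and confirm before leaving.
// shouldConfirmExit() is a hypothetical check for unsaved work.
@Override
public void onBackPressed() {
  if (shouldConfirmExit()) {
    new AlertDialog.Builder(this)
      .setMessage("Did you really want to leave this screen?")
      .setPositiveButton(android.R.string.ok, (dialog, which) -> finish())
      .setNegativeButton(android.R.string.cancel, null)
      .show();
  } else {
    super.onBackPressed();
  }
}
```

Whether a confirmation dialog is less annoying than an accidental exit is a judgment call that depends on how much state the user stands to lose.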
And, of course, if your app relies upon edge swipes — such as those drawer apps that were all the rage a few years ago — you are in trouble. Then again, those drawer apps may already be in trouble, given the apparent impending demise of SYSTEM_ALERT_WINDOW.
If this will purely be an opt-in option for users for Q devices, then this is fine, though developers should plan ahead for this perhaps being mandatory in R. If this is mandatory for some device users for Q, as the 9.0 gesture navigation was for Pixel 3 users… we really should have received more warning about this.
What Was New Before And Is Now Newer?
Here are some changes to things that debuted in previous Q betas:
Background activity starts are really blocked now. Prior betas would still launch the activity from the background but show a Toast indicating that this would eventually be blocked. Now, the Toast is there (with an updated message), and the activity is not displayed. Presumably, that Toast will go away in a future beta. There is also a warning message (not an error) in Logcat about the blocked background activity start.
There are minor tweaks to the Bubbles API, removing the title and adding a requested height option.
Similarly, there are minor tweaks to the BiometricPrompt.Builder API, with a couple of methods renamed.
Along the same lines, some WebView-related classes and methods were renamed, where “renderer” (e.g., WebViewRenderer) is now “render process” (e.g., WebViewRenderProcess).
RoleManager remains largely undocumented. However, it lost some of the constants that it had in earlier betas.
FEATURE_FOLDABLE was removed from PackageManager, indicating that we will not have a runtime ability to distinguish a foldable device.
NotificationAssistantService and NotificationStats were removed from the SDK. The former surprises me, as I thought this was how Android was supporting adding replies and actions automatically to notifications.
What Is Newly New In Beta 3?
For a third beta, there is more new than I would have expected.
Of particular interest:
The new audio capture API is a nice addition, particularly for screen recording apps. Hopefully it will be easy for those apps to blend the video with the audio, as those are captured separately. The audio capture APIs are an add-on to the media projection APIs from Android 5.0 and will require user acceptance. By default, apps with a targetSdkVersion of Q will allow their audio to be recorded, though there are opt-out options if this concerns you.
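One of those opt-out options appears to be a manifest attribute. A sketch of what that looks like, with the rest of the element elided:

```xml
<!-- AndroidManifest.xml (fragment): opt this app's audio playback
     out of capture by other apps -->
<application android:allowAudioPlaybackCapture="false">
    <!-- activities, services, etc. go here as usual -->
</application>
```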
Notification.Builder now offers setAllowSystemGeneratedContextualActions(), for controlling whether Google can add replies and actions automatically to your notification. If I understood correctly, these default to true for certain notification types and false for other types.
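Opting a notification out of those system-generated actions is a one-liner on the builder. A sketch, where the channel ID, icon, and text are all placeholders:

```java
// A sketch: build a notification that opts out of
// system-generated contextual actions and replies
Notification n = new Notification.Builder(context, "some-channel-id")
    .setSmallIcon(android.R.drawable.stat_notify_chat)
    .setContentTitle("Title goes here")
    .setContentText("Text goes here")
    .setAllowSystemGeneratedContextualActions(false)
    .build();
```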
We now have a bit more information about dark mode and how it affects apps. The automatic android:forceDarkAllowed="true" conversion is amusing, but you would really want to test your app thoroughly, rather than assume that this switch will yield perfect results. Google's preferred approach is to use DayNight themes, except that implies that most Android developers actually grok the theme system, rather than simply cope with it.
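As a sketch, the two approaches look something like this in resources (the theme names here are placeholders, and force-dark only applies to light themes):

```xml
<!-- res/values/styles.xml: the DayNight route, via AppCompat -->
<style name="AppTheme" parent="Theme.AppCompat.DayNight">
</style>

<!-- res/values-v29/styles.xml: or, opt a light theme into
     automatic force-dark conversion on Q -->
<style name="AppTheme.ForceDark" parent="Theme.AppCompat.Light">
    <item name="android:forceDarkAllowed">true</item>
</style>
```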
bindService() now has a new four-parameter version that takes an Executor as a parameter, where that Executor is used for the ServiceConnection callbacks. By default, those callbacks occur on the main application thread, and this gives you flexibility to move those to background threads.
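The bindService() call itself only runs on a device, but the Executor-routing idea can be illustrated on a plain JVM. The fakeBind() method below is a stand-in for the framework dispatching your ServiceConnection callbacks via whatever Executor you supply:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executor;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// A sketch of the idea behind the four-parameter
// bindService(Intent, int, Executor, ServiceConnection): the framework
// invokes your callbacks on the Executor you hand it, rather than on
// the main application thread. fakeBind() is a hypothetical stand-in.
public class ExecutorCallbackDemo {
  interface ConnectionCallback {
    void onConnected(String threadName);
  }

  static void fakeBind(Executor executor, ConnectionCallback cb) {
    // The "framework" dispatches the callback via the supplied Executor
    executor.execute(() -> cb.onConnected(Thread.currentThread().getName()));
  }

  public static String runDemo() throws Exception {
    ExecutorService background = Executors.newSingleThreadExecutor();
    final String[] callbackThread = new String[1];
    CountDownLatch latch = new CountDownLatch(1);

    fakeBind(background, name -> {
      callbackThread[0] = name;
      latch.countDown();
    });

    latch.await(5, TimeUnit.SECONDS);
    background.shutdown();
    return callbackThread[0];
  }

  public static void main(String[] args) throws Exception {
    // The callback lands on the executor's thread, not the caller's
    System.out.println("callback ran on: " + runDemo());
  }
}
```

On Android, you would pass something like Executors.newSingleThreadExecutor() as the third argument to bindService(), keeping connection churn off the main thread.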
Settings.ACTION_APP_NOTIFICATION_BUBBLE_SETTINGS lets you bring up a screen to let the user control your bubbles, if your app uses them.
Settings.Panel.ACTION_WIFI brings us another settings panel to display. It is not completely clear what the decision criteria would be for showing this instead of Settings.Panel.ACTION_INTERNET_CONNECTIVITY.
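Displaying one of these panels is just an implicit Intent from an activity; a sketch:

```java
// A sketch: show the Wi-Fi settings panel over the current activity
startActivity(new Intent(Settings.Panel.ACTION_WIFI));
```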
What Is Mystifying?
There is a set of methods around a “synthetic app details activity”; it is not completely clear what this is referring to.
Similarly, there are methods and constants about “whitelist restricted permissions”; the documentation for these is rather confusing.
What Else Was Eyebrow-Raising From Day 1 of Google I|O?
I thought that the point of Jetpack was to streamline the number of recommended options for various app development topics. So, I was surprised to hear about Jetpack Compose and Jetpack ViewBindings. Partly, that is because they seem to be mutually contradictory: ViewBindings is for layout resources, and Jetpack Compose seems to aim to eliminate layout resources. Partly, that is because they both seem to step on data binding’s toes, and data binding is part of Jetpack. I can see where ViewBindings and data binding might coexist: use ViewBindings when all you want is generated widget references, but use data binding when you want that plus binding expressions. I hope that Google delivers a consistent and coherent story about when you use which of these technologies… and whether data binding is headed for the chopping block.
Otherwise, the major presentations (Google keynote, developer keynote, and Android keynote) seemed to hold few surprises for Android developers.
I plan to release another update to Elements of Android Q next week, to update what I have for Beta 3 and to add more coverage of the changes to location access.
And, as I mentioned earlier, I will write more about the scoped storage changes later this week.