The CommonsBlog


Random Musings on the Android 17 Beta 1

Last time around, with Android 16, we had two developer previews before a beta. This time, not so much: we went straight to Beta 1 for Android 17. Not only did we lose months of developer preview time, but Beta 1 came out three weeks later than Android 16 Beta 1 did.

The good news is: there is not all that much in this release. The bad news is: the point of the release seems to be to break apps. Whether it’s immediately or a year out, when the Android 17 targetSdk becomes required, there is quite a bit in this release that might screw up your app, even while things like adaptive screens and background audio hardening get all the attention.

There is a fair bit of documentation, but it is a bit disjointed. For example, the Android 17 Beta 1 announcement blog post mentions a change to the size of custom notification views that takes effect when you move to target Android 17… but the page on behavior changes for targeting Android 17 does not mention it.

Other documentation bugs include:

  • The API differences report claims a lot of changes were made in API Level 1 when that does not appear to be the case

  • They claim to have added a new DEVICE_PROFILE_FITNESS_TRACKER companion request, but that seems to have been around for years

  • While some places in the docs indicate that Android 17 will result in an API version of 37, note that Build.VERSION_CODES.CINNAMON_BUN is 10000 for Beta 1, continuing a long-standing tradition of using high temporary values for new version codes until the API changes stabilize

(though I do appreciate the choice of cinnamon bun as the tasty treat for this release! 😋)
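If you need to detect an Android 17 preview at runtime, the historical pattern is to lean on PREVIEW_SDK_INT rather than the placeholder version code. A minimal sketch, assuming the preview keeps SDK_INT at the previous stable level (36, for Android 16), as past previews have:

```kotlin
import android.os.Build

// Sketch: detect an Android 17 (CinnamonBun) preview build. Preview
// builds historically keep SDK_INT at the previous stable API level
// and report a non-zero PREVIEW_SDK_INT, since CINNAMON_BUN itself is
// the placeholder value 10000 until the APIs stabilize.
val isAndroid17Preview: Boolean =
    Build.VERSION.SDK_INT >= 36 && Build.VERSION.PREVIEW_SDK_INT > 0
```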

My usual random musings focus on changes in the API differences report that do not seem to be covered elsewhere. Android 17 Beta 1 is a tiny release in terms of API changes, so there is not much to report:

  • FINGERPRINT_SERVICE is being officially removed from Context. Since that system service was deprecated years ago in favor of the biometrics APIs, hopefully this will not affect you.

  • There is a new STREAM_ASSISTANT volume level defined in AudioManager

  • There is a new ACTION_SUPERVISION_SETTINGS in Settings. This action string may “show screen to manage supervision settings”. While the documentation does not state this, please assume that any given device might not offer this screen and plan accordingly. Personally, I worry about the potential ramifications of Android getting capabilities like X-ray vision that would be enabled in a super-vision screen.

  • Wait… I am now receiving word that “supervision settings” may not refer to actual super-vision powers. Carry on!

  • startSafeBrowsing() on WebView is now deprecated. Unsafe browsing presumably is still supported.
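For that supervision settings action, defensive launching is straightforward. A minimal sketch (the helper name is mine, not part of the API):

```kotlin
import android.content.Context
import android.content.Intent
import android.provider.Settings

// Sketch: open the supervision settings screen, if this device offers
// one. openSupervisionSettings() is a hypothetical helper name.
fun openSupervisionSettings(context: Context): Boolean {
    val intent = Intent(Settings.ACTION_SUPERVISION_SETTINGS)

    // resolveActivity() returns null when nothing handles the action,
    // letting us skip startActivity() and fail gracefully, rather than
    // crash with an ActivityNotFoundException on devices lacking the screen
    return if (intent.resolveActivity(context.packageManager) != null) {
        context.startActivity(intent)
        true
    } else {
        false
    }
}
```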

Feb 14, 2026


11 February 2026 Artifact Wave

There were only two new artifacts this week:

  • androidx.media3:media3-effect-lottie
  • androidx.media3:media3-inspector-frame

The roster of 600+ updated artifacts can be found here!

Feb 11, 2026


Write for Your Readers... Even If They Are Agents

Coding agents are here to stay. That may be a double-edged sword, but even if the “AI bubble” pops, agents will still be available. Personally, I am starting to experiment with locally-run models, like Devstral Small 2, to see how well they work in the role. Of course, plenty are using Claude Code and OpenAI Codex and JetBrains Junie and so on, whether tied to hosted frontier models or others.

If coding agents are here to stay, we need to look at adapting our ecosystem to accommodate them.

Historically, our documentation has been designed around human readers. The bigger the project, the more likely it is to have nicely-formatted documentation, spread over many pages. Or the project uses a Wiki, CMS, or other tool for documentation. Heck, some crazy people even write books about programming, distributed as PDFs and EPUBs and the like.

LLMs like text. Plain text. Maybe a bit of Markdown for spice. But, in the end, it’s text, text, text.

Some LLMs will include some projects’ documentation as part of a training set. That’s all well and good, but the LLM’s knowledge base (such as it is) will only be as up to date as that training set. Pointing an agent to ordinary well-formatted documentation, in the hopes that it will understand and incorporate it, is a hit-or-miss proposition, as the LLM needs to fight through all of the formatting. Formatting is important for human readers, but it is cruft to an agent.

If your project’s documentation is already in Markdown or other forms of plain text, consider ensuring that the documentation is discoverable in raw Markdown, as opposed to having been rendered by MkDocs or GitHub Pages or whatever. Linking to that from the project README, perhaps with some notes for agents to help them find it, may help the agents leverage your library or tool.

Better yet is to create something designed specifically for agents. Some folks are experimenting with custom text files, for example.
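One such experiment is the proposed llms.txt convention: a Markdown file at a well-known location that gives agents a terse map of the raw documentation. A hypothetical example (the project name and URLs are made up):

```markdown
# MyLibrary

> A one-line summary of what MyLibrary does and when you would use it.

## Docs

- [Getting started](https://example.com/docs/getting-started.md): installation and first use
- [API reference](https://example.com/docs/api.md): the full public API
```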

Personally, I expect that skills will be the better option, at least in the near term. Many agents already support skills, either from locally-installed files or via some form of import mechanism (e.g., Claude Code’s plugin system).

For example, Ben Oberkfell recently published a plugin that packages up some skills for popular Jetpack Compose libraries like Showkase and Haze, plus Kotlin Multiplatform libraries like Metro. Whether it’s a one-line git clone or a one-line /plugin command in Claude Code, developers can add these skills that their agents can use.
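For reference, a Claude Code-style skill is little more than a directory containing a SKILL.md with some YAML frontmatter (a name and a description the agent uses to decide when to load it). A minimal sketch, with a hypothetical library:

```markdown
---
name: mylibrary-usage
description: How to add MyLibrary to an Android project and use its core APIs
---

# Using MyLibrary

Add the dependency to the module's build.gradle.kts:

    implementation("com.example:mylibrary:1.0.0")

Then initialize it once, early in app startup, before calling anything else.
```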

My guess is that there will be fairly rapid iteration and “ideation” in this area, so if you want to sit tight for a bit and “wait for the dust to settle”, that’s reasonable. But, one way or another, we need to start thinking about how agents will learn how to use our stuff, not just ordinary human developers.

Feb 06, 2026


29 January 2026 Artifact Wave

Well, after a one-day delay, the rest of this wave’s artifacts made it into Gradle!

Five brand-new artifacts showed up, four for Tracing and one for XR:

  • androidx.tracing:tracing-desktop
  • androidx.tracing:tracing-wire
  • androidx.tracing:tracing-wire-android
  • androidx.tracing:tracing-wire-desktop
  • androidx.xr.projected:projected-binding

The roster of 700+ updated artifacts can be found here!

Jan 29, 2026


28 January 2026 Artifact Wave

While Google claims a lot of stuff was released, somebody forgot to put them into Maven, as maven.google.com does not know about them. 🤷🏻

What we do have is a patch release to Media3:

  • androidx.media3:media3-cast:1.9.1
  • androidx.media3:media3-common:1.9.1
  • androidx.media3:media3-common-ktx:1.9.1
  • androidx.media3:media3-container:1.9.1
  • androidx.media3:media3-database:1.9.1
  • androidx.media3:media3-datasource:1.9.1
  • androidx.media3:media3-datasource-cronet:1.9.1
  • androidx.media3:media3-datasource-okhttp:1.9.1
  • androidx.media3:media3-datasource-rtmp:1.9.1
  • androidx.media3:media3-decoder:1.9.1
  • androidx.media3:media3-effect:1.9.1
  • androidx.media3:media3-exoplayer:1.9.1
  • androidx.media3:media3-exoplayer-dash:1.9.1
  • androidx.media3:media3-exoplayer-hls:1.9.1
  • androidx.media3:media3-exoplayer-ima:1.9.1
  • androidx.media3:media3-exoplayer-midi:1.9.1
  • androidx.media3:media3-exoplayer-rtsp:1.9.1
  • androidx.media3:media3-exoplayer-smoothstreaming:1.9.1
  • androidx.media3:media3-exoplayer-workmanager:1.9.1
  • androidx.media3:media3-extractor:1.9.1
  • androidx.media3:media3-inspector:1.9.1
  • androidx.media3:media3-muxer:1.9.1
  • androidx.media3:media3-session:1.9.1
  • androidx.media3:media3-test-utils:1.9.1
  • androidx.media3:media3-test-utils-robolectric:1.9.1
  • androidx.media3:media3-transformer:1.9.1
  • androidx.media3:media3-ui:1.9.1
  • androidx.media3:media3-ui-compose:1.9.1
  • androidx.media3:media3-ui-compose-material3:1.9.1
  • androidx.media3:media3-ui-leanback:1.9.1

Jan 28, 2026


Older Posts