The CommonsBlog

Uncomfortable Questions About App Signing

Dear Google Play Team:

Recently, you stated:

we intend to require new apps and games to publish with the Android App Bundle on Google Play in the second half of 2021

(emphasis yours)

To publish an App Bundle, we must use App Signing:

it is a requirement to use Play App Signing in order to publish with App Bundles on Google Play.

This gives you signing authority over the APKs that are delivered to people. As far as I can tell, this means that you can do whatever you want with the contents of those APKs, including adding to and replacing the original code supplied by the app’s developers. Worse, this requirement for new apps feels like a trial run for eventually requiring all developers to opt into App Signing.

Given that… we need to talk.

Some people may be worried about you modifying apps for your own benefit, such as degrading performance for apps that compete with Google properties. That does not keep me awake at night.

This does:

A country ruled by a repressive regime is oppressing a minority group. The leadership of that regime tells Google: “In order to do business in our country, you need to agree to distribute altered versions of certain apps to people of our choosing. We will supply you with the altered apps, and we will supply you with the identities of, or criteria for choosing, the people who should receive these altered apps. Those criteria might be narrow or wide, up to and including all people within our country.”

They then supply Google with the identities of leaders of the minority group, along with altered versions of next-generation end-to-end (E2E) encrypted messaging apps. These altered apps capture communications by grabbing the data directly out of the app’s UI and sending it to a regime-controlled server, bypassing the encryption.

Google gives the regime each version of the targeted apps as it gets released and postpones those releases on the Play Store, under the guise of the app review process, so the regime can make its changes to those apps. Google signs those altered apps — no different than if the apps came from the original developers — and distributes those altered apps to the designated victims.

You can supply your own values for the country and the minority group. There are many options to choose from.

App Signing, and your upcoming moves to make it mandatory for new apps, opens you up to this sort of coercion. Right now, you have “plausible deniability” — you can claim that certain apps simply did not opt into App Signing. Your new requirement removes that defense for new apps after your chosen date for that requirement. If you eventually force all developers to opt into App Signing, then this form of coercion seems inevitable.

Project Dragonfly is a depressing demonstration of executive intent. However, there seemed to be some internal push-back against Project Dragonfly within your firm, at least among engineers.

I am hopeful that those of you who are being forced to implement this new policy have a plan for how to block this sort of coercion or otherwise prevent my scenario from playing out. It would be good if we knew what that plan was, as it is much easier for us to help with a plan that we actually know about.

So… what’s the plan?

Right now, publicly, the plan appears to be to claim that this just will not happen.

For example, in a Medium post about app signing, a Google developer advocate wrote:

we don’t modify and distribute your application code without your knowledge and approval

As stated before, Play will not modify the functionality of your application without your knowledge and approval.

Notably, this person used “don’t” and “will not”… as opposed to “can’t” and “cannot”.

However, you readily admit that you modify the app from what we give you as the App Bundle. For example, that same Medium post includes:

For apps uploaded as app bundles, we will improve this security by introducing what is called a source stamp. This source metadata is inserted into the app’s manifest by bundletool.

That sort of metadata change apparently has been going on for a couple of years.

Besides, Amazon has been doing this sort of thing for the better part of a decade:

Amazon wraps your app with code that enables the app to communicate with the Amazon Appstore client to collect analytics, evaluate and enforce program policies, and share aggregated information with you. Your app will always communicate with the Amazon Appstore client when it starts. [To do this], Amazon removes your signature and re-signs your app with an Amazon signature that is unique to you, does not change, and is the same for all apps in your account.

You admit that you can do the same sort of re-signing for apps distributed via App Signing, even those shipped as APKs:

Google verifies and strips your signature from the APK, and then resigns the APK with the app signing key

So, it seems like “don’t” instead of “can’t” represents a policy of forbearance. You appear to be capable of doing much more, but you are deciding not to do so at the present time.
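Mechanically, the stripping step that both stores describe is not exotic. Here is a sketch, assuming a v1 (JAR-signed) APK, where the signature lives in files under META-INF/; removing those leaves an unsigned archive that anyone holding a key could re-sign. (v2/v3 signatures live in a separate APK Signing Block and would be handled differently; this is an illustration, not Play’s actual code, and the toy file names are made up.)

```python
import io
import zipfile

# v1 (JAR) signature files live under META-INF/ with these suffixes
SIG_SUFFIXES = (".RSA", ".DSA", ".EC", ".SF", ".MF")

def strip_v1_signature(apk_bytes: bytes) -> bytes:
    """Copy a zip/APK, dropping the original developer's v1 signature files."""
    src = zipfile.ZipFile(io.BytesIO(apk_bytes))
    out_buf = io.BytesIO()
    with zipfile.ZipFile(out_buf, "w") as out:
        for info in src.infolist():
            name = info.filename
            if name.startswith("META-INF/") and name.upper().endswith(SIG_SUFFIXES):
                continue  # this is (part of) the signature; skip it
            out.writestr(info, src.read(name))
    return out_buf.getvalue()

# Build a toy "APK" to demonstrate:
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("classes.dex", b"bytecode")
    z.writestr("META-INF/MANIFEST.MF", b"manifest")
    z.writestr("META-INF/CERT.SF", b"signature file")
    z.writestr("META-INF/CERT.RSA", b"certificate block")

stripped = strip_v1_signature(buf.getvalue())
print(zipfile.ZipFile(io.BytesIO(stripped)).namelist())  # ['classes.dex']
```

Once the signature is gone, nothing in the archive itself prevents re-signing with a different key, which is the whole point of the concern.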

However, policies can change, at any time, for any reason, without warning. Or, as some guy in a dark helmet once said:

I am altering the deal. Pray I don’t alter it any further.

So, one possible plan is for you to have teeth behind the “don’t modify and distribute your application code without your knowledge and approval” claim.


  • Is your claim that modifying the code is impossible? It seems like it is possible, but perhaps I am missing something. I would love to see a technical explanation of how you would be unable to change the bytecode or resource content prior to signing the APKs.

  • Are you pursuing legislation to ban this practice? Perhaps you are working with lawmakers in major countries to ban this sort of behavior, with suitable penalties for firms (and their executives) that participate in it. That might cover more than just app modifications — it might also ban modifications made to Web content in transit by ISPs, such as the various “super-cookie” sorts of changes that crop up from time to time. If this is your plan, it would be nice if a wide range of participants, including those from organizations advocating for civil liberties, would be involved.

Perhaps the plan is that you really will not require App Signing for App Bundles. After all, just because one of your staff members said that App Bundles require App Signing does not mean that this requirement will hold indefinitely. Perhaps it will be relaxed before the App Bundle mandate comes into play. Or, perhaps the App Bundle mandate itself might be relaxed.

After all, App Bundles are not a technical requirement. We have been distributing APKs for over a decade, and in that time, the dead have not risen, kaiju have not laid waste to major cities, and aliens have not invaded.

(then again, it is 2020…)

Also, App Bundles would not appear to be necessary to achieve your technical objectives. Your goal is to deliver the right subsets of an app to a device based on device characteristics. To do that, you generate a sliced-up app out of an App Bundle and deliver a few APKs that, when combined, implement the app without extraneous resources and stuff. We know this because bundletool lets us generate those APKs, and we can even install those APKs ourselves to demonstrate that they work. bundletool will even sign them with our signing key.

That means that we do not need to send you an App Bundle for you to get the benefits of one. We could send you the signed APKs created by bundletool instead. For example, we could ZIP those up into an “App Assortment” and upload that to the developer console instead of the App Bundle. You would appear to get everything that you need to achieve your technical objectives, just without the ability to sign. Instead, we sign the APKs, just as we do today, albeit perhaps via bundletool.
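The bundletool workflow that demonstrates this looks something like the following (the bundle/keystore file names and alias are placeholders; see the bundletool documentation for the full option list):

```shell
# Generate a set of split APKs from an App Bundle, signed with OUR key
# (keystore path and alias here are placeholders):
java -jar bundletool.jar build-apks \
  --bundle=app-release.aab \
  --output=app.apks \
  --ks=release.jks \
  --ks-key-alias=release

# Install the right subset of those APKs onto a connected device:
java -jar bundletool.jar install-apks --apks=app.apks
```

The resulting .apks archive already contains developer-signed, device-targeted APKs, which is why an upload format along these lines seems feasible.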

This is not to say that you should stop offering App Signing, just that it should be a choice, as it has been since you introduced it.

Perhaps something along these lines is how you intend to avoid coercion — you can claim that any app that regimes want to alter is still being signed by the app’s developers, not you. So:

  • Will you extend App Bundles to allow for developer-signed artifacts and no App Signing? This seems to be fairly trivial, given the existing capabilities of bundletool. But, if there is some reason why bundletool output that seems to work does not actually work… that would be good to know, with adequate details, of course.

  • Will you rescind the App Bundle requirement, and allow for developer-signed APKs indefinitely? After all, you are the one announcing that App Bundles are going to be required. You can change your mind on that.

Perhaps you can modify apps prior to signing, and the reason that you are requiring signing authority is because you think that you might want to modify apps in the future, even though you claim that you are not doing so right now.

Maybe your plan is that we would detect any nefarious modifications, should you ship some.

If so, I would be interested to know how you would like us to go about doing that.

Technically, the regime-hack scenario could happen today, even with developer-signed APKs. However, in that case, you cannot sign the altered app versions. So, developers who monitor the signing keys used for their apps would be able to detect that you are distributing an altered app. Plus, the risk would be limited to new installs of the app — any device with an existing installation of the developer-signed app would refuse to install the altered version, as the signing keys would not match. Admittedly, we could do a lot more here to monitor for this sort of attack, but we get a lot of protections “for free”, just from having developers sign the apps. And a regime would have a much harder time coercing an app developer directly, because that developer does not control distribution: everybody would have to get the alterations, making them far more likely to be discovered.

But, with App Signing, we cannot just check an app’s signature. The signature will be valid; the contents of the APKs are what is in question, because you can modify them based on the demands applied via the coercion.

Perhaps you are counting on the new App Bundle explorer:

You can download and attest the exact APKs that Play generates for delivery

The problem is that we have no way to know if these are the actual APKs that Play will distribute. After all, you are perfectly capable of distributing one set of APKs through the explorer and another to Android device owners.

Perhaps your vision is that we would use hashes or something to confirm that the APKs delivered to people match those from the explorer. The implication is that it would be up to us developers to:

  • Confirm that the explorer APKs match bundletool output, as we can see what bundletool does and confirm that it is not doing anything malicious; then

  • Publish hashes of the bundletool/explorer output in such a way that security apps (and perhaps our own apps) could validate that the installed APKs are valid; and

  • Ensure that those hashes themselves are not able to be modified by some attacker

That is not out of the question. However, it is a fair bit of infrastructure.
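A minimal sketch of that hash pipeline (entirely hypothetical; none of this infrastructure exists today): the developer publishes a digest for each bundletool-generated APK, and a verifier recomputes digests of whatever was actually installed and compares:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Step 1 (developer side): publish a digest for each generated APK.
# These byte strings stand in for real APK contents.
published = {
    "base-master.apk": sha256_hex(b"original base APK contents"),
    "base-xxhdpi.apk": sha256_hex(b"original density split contents"),
}

# Step 2 (verifier side): recompute digests of the APKs actually
# installed on a device and compare against the published values.
def verify(name: str, installed: bytes) -> bool:
    return published.get(name) == sha256_hex(installed)

print(verify("base-master.apk", b"original base APK contents"))    # True
print(verify("base-master.apk", b"altered by somebody upstream"))  # False
```

Step 3 — keeping the `published` table itself from being tampered with — is where most of the real infrastructure cost lies.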

More importantly, though, it gets back to the root concern outlined earlier in this letter: if your plan is that you want to modify the apps, then presumably someday you will do just that. So now we are in a situation where bundletool output intentionally does not match what you are shipping to people. Detecting changes would be insufficient — we would need to determine (somehow) whether those changes are malicious or not, and do this on a massive scale (one that precludes manual inspection).

So, how are you expecting us to “attest” these APKs, for real? What is the process by which you are expecting us to confirm that, while you have made changes to the APKs, that those changes are beneficial and not harmful? And how do you expect us to do this for everyone who is using the app, given that you could modify the app for some people and not for others?

When App Signing came out, I was concerned for the reasons that I outlined in this letter. However, it was opt-in, and so while I would quietly steer developers away from it, that is all that I did. Now that you are making it mandatory for some apps — and appear to have the ability to make it mandatory for all in the future — I think that it is time that we figure out how to minimize the risk to the ~2.5 billion Android device owners.

So… what’s the plan?

Sincerely Yours,

Mark Murphy (a Commons Guy)

Sep 23, 2020

"Elements of Android Room" Version 0.3 Released

Subscribers now have access to Version 0.3 of Elements of Android Room, in PDF, EPUB, and MOBI/Kindle formats. Just log into your Warescription page to download it, or set up an account and subscribe!

This update adds three more chapters.

In addition:

  • The PagedFTS sample was updated to take into account the option for packaging a database in your app

  • A bunch of dependencies were updated, notably Room itself

  • Various bugs were fixed

Sep 21, 2020

App Security at Android Summit 2020!

I’ll be presenting as part of Android Summit 2020, which has moved from the Washington DC area to the Web due to the pandemic, as have so many other events.

This year, I wanted to return to a subject that I have presented on many times before: app security. Generally, this subject is a bit of a backwater — Android experts are orders of magnitude more likely to write or talk about animations than security.

However, in the 2020 edition of the Android Summit, it seems like there will be a few presentations on security. This is a welcome change! However, it means that my planned “survey of app security” talk isn’t a great fit.

So, I’m going to be tweaking it slightly, focusing on modern app development and where security comes into play:

  • How (and when!) do we think about security in an agile/SCRUM world?

  • Where can security flaws come from? (hint: it’s not just your code)

  • How do we automate security checks the way that we automate our testing?

  • And so on

I’ll be using concrete examples of problems along the way to illustrate these sorts of “AppSecOps” or “DevSecOps” topics.

The Android Summit is being held October 8-9. I do not know right now exactly when I will be speaking, other than it will be in the US East Coast afternoon or early evening. This is OK for the Western Hemisphere; if you are elsewhere, you may wind up needing to wait for the conference video to be published.

But, at the same time, since this is being held online, anyone can attend, not just those who are in a position to get to DC! Visit the Android Summit site for details on getting tickets and, in the coming weeks, the complete event schedule.

I look forward to seeing you (virtually) there!

Sep 16, 2020

Android R One-Time Permission Problem Really an Android Studio Problem

Last month, I wrote about an apparent bug in Android R, where one-time permission expiration sometimes kills alarms and jobs.

Google investigated and concluded that it is really an Android Studio problem. I can confirm their basic findings — the problem does not show up if you install and run the sample app from the command line, for example.

What is supposed to happen when a one-time permission is revoked is that the app’s process is terminated. This is in line with how other permission revocations are handled. Other than the process termination happening relatively rapidly after the app leaves the foreground, there is nothing unusual here.

Android Studio, though, does not feel that this is punitive enough. Studio wants to teach that app who is in charge. So, Studio uses adb to force-stop the app (somehow…) when it detects that a process that Studio started was terminated.

I do not know why Android Studio feels that it needs to be so mean.

Regardless, this means that the original problem should only happen during development, not in production. This may complicate app development work, but it should not harm users. While this is not ideal, it is much better than what I was fearing.

Many thanks to Nicole Borelli and the rest of the Google team that investigated the problem!

Sep 13, 2020

Getting Android Studio 4.2 Canary To Run Again

If you are like me, when Android Studio 4.2 Canary 8 came out, it would crash on startup with:

OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Error occurred during initialization of VM
Multiple garbage collectors selected

This has been reported for Ubuntu, macOS, Arch Linux, and (albeit with an incorrect title) Chrome OS.

Thanks to this comment, it is clearer what is going on.

When Android Studio 4.2 runs for the first time, it creates a $HOME/.config/Google/AndroidStudioPreview4.2/ directory and dumps a bunch of files and directories in there. One of those files is studio64.vmoptions, which contains a bunch of java command-line options that presumably get applied when Studio is launched.

One of those, -XX:+UseConcMarkSweepGC, seems to be what is triggering this crash.

So, see if you have that file — if you are getting this crash, you should. Just edit the file, remove that line, and save your changes. I was able to get Canary 9 to start after that, and while I did not test Canary 8, my guess is that this fix will handle it as well.
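If you would rather script the edit, here is a sketch. It is demonstrated against a throwaway stand-in file; for the real fix, point `opts` at your actual studio64.vmoptions path instead (e.g. under $HOME/.config/Google/AndroidStudioPreview4.2/ on Linux).

```python
import tempfile
from pathlib import Path

# Demo file standing in for studio64.vmoptions; for real use, set
# `opts` to the path of your actual studio64.vmoptions instead.
opts = Path(tempfile.mkstemp(suffix=".vmoptions")[1])
opts.write_text("-Xms256m\n-XX:+UseConcMarkSweepGC\n-Xmx1280m\n")

# Keep every line except the deprecated GC flag that crashes the VM:
kept = [line for line in opts.read_text().splitlines()
        if "UseConcMarkSweepGC" not in line]
opts.write_text("\n".join(kept) + "\n")

print(opts.read_text())  # only -Xms256m and -Xmx1280m remain
```

Consider backing up the file first, in case a future canary expects different contents.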

There may be a better fix than removing this line, and ideally we would not have to manually edit this file. But, this is better than nothing.

Sep 05, 2020
