App Widgets as Wearable UIs

Since I am attending Wearables DevCon here in scenic Burlingame (no, really, the walk along the bay is quite nice), this seemed to be a good time for me to grouse about developing apps for wearables.

Adapting an Android app to a wearable device, particularly one that runs Android, is not that hard. You have to deal with different screen sizes, be more judicious about your gestures, worry all the more about power consumption, and so on. But, at the end of the day, it’s still an Android app.

Where things start to get messy is with wearable accessories, where your app does not run on the wearable itself, but instead runs on a tethered phone or tablet. You use some manufacturer-supplied API to render a GUI and respond to touch input.

This is messy for two reasons:

  1. The people who design those manufacturer-supplied APIs do not necessarily think about what is going to garner the most adoption of those APIs

  2. Each manufacturer rolls their own, meaning that developers have several such APIs to mess with, if not today, then tomorrow

Now, in some cases, the wearable really does need a distinctive API, because the UI you would want to create is itself distinctive. Augmented spectacles, like Google Glass, would be one such example. However, writing an app for one wrist wearable should not be radically different from writing an app for another wrist wearable, at least for basic apps.

What makes this all the more grating is that wearable manufacturers have a perfectly good API already in Android that they could be using: app widgets.

A home screen app widget has three characteristics in common with wrist wearable UIs (a minimal example follows the list):

  • The UI is small

  • The UI does not support much in the way of touch events (taps, plus swipes on select widgets like ListView)

  • The information shown there is meant to be available at a glance, at any point
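
For concreteness, here is roughly what the provider side of a conventional app widget looks like. The layout, view IDs, and activity (R.layout.widget_forecast, R.id.temperature, ForecastActivity) are made up for illustration, but AppWidgetProvider, RemoteViews, and PendingIntent are the actual framework pieces that a wearable host could consume unchanged:

    import android.app.PendingIntent;
    import android.appwidget.AppWidgetManager;
    import android.appwidget.AppWidgetProvider;
    import android.content.Context;
    import android.content.Intent;
    import android.widget.RemoteViews;

    public class ForecastWidgetProvider extends AppWidgetProvider {
      @Override
      public void onUpdate(Context ctxt, AppWidgetManager mgr,
                           int[] appWidgetIds) {
        for (int appWidgetId : appWidgetIds) {
          // RemoteViews describes the UI declaratively, which is what
          // lets any host (launcher, lock screen, or a hypothetical
          // wearable bridge) render it without running the app's code
          RemoteViews views=
            new RemoteViews(ctxt.getPackageName(), R.layout.widget_forecast);

          views.setTextViewText(R.id.temperature, "72°F");

          // taps are expressed as PendingIntents, not touch listeners,
          // so any host can trigger them on the widget's behalf
          Intent open=new Intent(ctxt, ForecastActivity.class);
          PendingIntent pi=
            PendingIntent.getActivity(ctxt, 0, open,
                                      PendingIntent.FLAG_UPDATE_CURRENT);

          views.setOnClickPendingIntent(R.id.temperature, pi);

          mgr.updateAppWidget(appWidgetId, views);
        }
      }
    }

Nothing in that code knows or cares what is hosting the widget, which is precisely the property a wearable accessory would want to exploit.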

App widgets have been around since Android 1.5, back in 2009. There are tens of thousands of Android apps out on the Play Store with app widgets, and many (if not most) of them would be right at home as an app on a wearable.

For a wearable accessory, the mediation app supplied by the manufacturer that runs on the phone or tablet would serve as an AppWidgetHost. Rather than render the app widgets on the phone or tablet screen, though, it would render them to a Bitmap-backed Canvas, sending the bitmap over to the wearable for display. Touch events would be handled either by forwarding them along to the View hierarchy of the app widget (e.g., for swipes on a ListView) or by triggering the PendingIntent associated with a tap. From the standpoint of the app widget developer, everything would look just as if the device’s home screen were the one hosting the app widget.
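
Here is a minimal sketch of what that bridge might look like, assuming the app widget ID has already been allocated and bound (binding normally requires user approval via ACTION_APPWIDGET_BIND, or the system-level BIND_APPWIDGET permission). The AppWidgetHost, AppWidgetHostView, and bitmap-rendering calls are real framework APIs; sendToWearable() is a placeholder for whatever proprietary transport the manufacturer uses:

    import android.appwidget.AppWidgetHost;
    import android.appwidget.AppWidgetHostView;
    import android.appwidget.AppWidgetManager;
    import android.appwidget.AppWidgetProviderInfo;
    import android.content.Context;
    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.view.View;

    public class WearableWidgetBridge {
      private static final int HOST_ID=1337; // arbitrary host identifier
      private final Context ctxt;
      private final AppWidgetManager mgr;
      private final AppWidgetHost host;

      public WearableWidgetBridge(Context ctxt) {
        this.ctxt=ctxt;
        mgr=AppWidgetManager.getInstance(ctxt);
        host=new AppWidgetHost(ctxt, HOST_ID);
        host.startListening(); // receive RemoteViews updates from providers
      }

      // render an already-bound app widget at the wearable's resolution
      public void renderToWearable(int appWidgetId, int widthPx, int heightPx) {
        AppWidgetProviderInfo info=mgr.getAppWidgetInfo(appWidgetId);
        AppWidgetHostView hostView=host.createView(ctxt, appWidgetId, info);

        // measure and lay out offscreen at the wearable's screen size
        hostView.measure(
          View.MeasureSpec.makeMeasureSpec(widthPx, View.MeasureSpec.EXACTLY),
          View.MeasureSpec.makeMeasureSpec(heightPx, View.MeasureSpec.EXACTLY));
        hostView.layout(0, 0, widthPx, heightPx);

        // draw into a Bitmap-backed Canvas instead of onto the phone's screen
        Bitmap bitmap=
          Bitmap.createBitmap(widthPx, heightPx, Bitmap.Config.ARGB_8888);
        hostView.draw(new Canvas(bitmap));

        sendToWearable(bitmap);
      }

      private void sendToWearable(Bitmap bitmap) {
        // placeholder: compress and ship over Bluetooth or the like
      }
    }

Touch events coming back from the wearable could be replayed into the same AppWidgetHostView via dispatchTouchEvent(), so ListView swipes and PendingIntent taps would behave just as they do on a home screen.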

And that’s the point.

Every line of custom code needed for a wearable is a barrier to adoption. Developers are not necessarily going to run and jump on every wearable that comes down the pike. By using an existing framework, one that is (reasonably) well-documented and in use elsewhere, wearables can get more app support more quickly with less proprietary stuff.

Now, in all likelihood, the app widget API was not designed with wearables in mind. There may be things that the wearable can do that go beyond the app widget. How much of that could be shoehorned into an extended version of the app widget framework (e.g., a WearableWidgetProvider subclass of AppWidgetProvider supporting additional broadcasts and methods) would depend upon the wearable. And offering a complete proprietary API as an alternative to using app widgets, for developers who want to integrate deeply with the wearable, is similarly reasonable.
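
Such an extension might look something like the following sketch. Everything here is hypothetical, from the class name to the custom broadcast action; the point is only that the existing AppWidgetProvider contract leaves room for wearable-specific additions:

    import android.appwidget.AppWidgetProvider;
    import android.content.Context;
    import android.content.Intent;

    // hypothetical class, not part of the Android SDK
    public abstract class WearableWidgetProvider extends AppWidgetProvider {
      // hypothetical broadcast a wearable host might send for a
      // capability the home screen lacks, such as hardware buttons
      public static final String ACTION_WEARABLE_BUTTON=
        "com.example.wearable.action.BUTTON";

      @Override
      public void onReceive(Context ctxt, Intent i) {
        if (ACTION_WEARABLE_BUTTON.equals(i.getAction())) {
          onWearableButton(ctxt, i.getIntExtra("button", -1));
        }
        else {
          super.onReceive(ctxt, i); // standard app widget broadcasts
        }
      }

      // extra callback for capabilities the home screen never had
      protected abstract void onWearableButton(Context ctxt, int button);
    }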

Wearables are still a fringe technology in 2014, compared to their phone and tablet brethren. If wearable manufacturers want a lot of apps quickly, using app widgets would seem to be the route to take. The wearable needs to adapt to the developers, not the other way around. Or, as the saying goes, “If the mountain won’t come to Muhammad, then Muhammad must go to the mountain.”