The following is the first few sections of a chapter from The Busy Coder's Guide to Android Development, plus headings for the remaining major sections, to give you an idea about the content of the chapter.
As developers, we are very used to creating apps that are designed to be navigated by touch, with users tapping on widgets and related windows to supply input.
However, not all Android devices have touchscreens, and not all Android users use touchscreens.
Internationalization (i18n) and localization (L10n) give you opportunities to expand your user base to audiences beyond your initial set, based on language. Similarly, you can expand your user base by offering support for non-touchscreen input and output. Long-term, the largest user base of these features may be those with televisions augmented by Android, whether via Android TV, OUYA consoles, or whatever. Short-term, the largest user base of these features may be those for whom touchscreens are rarely a great option, such as the blind. Supporting those with unusual requirements for input and output is called accessibility (a11y), and represents a powerful way for you to help your app distinguish itself from competitors.
In this chapter, we will first examine how to better handle focus management, and then segue into examining what else can be done to improve accessibility, beyond supporting keyboard-based input.
Understanding this chapter requires that you have read the core chapters and are familiar with the concept of widgets having focus for user input.
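As a taste of the focus management discussed in this chapter, here is a minimal layout sketch showing how the standard `android:nextFocus*` attributes let you declare an explicit focus order, so users navigating with a D-pad or keyboard move through a form predictably rather than relying on Android's default focus heuristics. The widget IDs and the overall layout here are hypothetical, purely for illustration:

```xml
<!-- Hypothetical form layout: the nextFocusDown/nextFocusUp attributes
     override the default focus-search order for non-touch navigation. -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <EditText
        android:id="@+id/name"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:nextFocusDown="@+id/email"/>

    <EditText
        android:id="@+id/email"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:nextFocusUp="@id/name"
        android:nextFocusDown="@+id/save"/>

    <Button
        android:id="@+id/save"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Save"
        android:nextFocusUp="@id/email"/>
</LinearLayout>
```

Note that `@+id/...` in a `nextFocus*` attribute forward-declares an ID that is defined later in the file; widgets referenced this way must be focusable at runtime, or Android will throw an exception when focus moves.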
The preview of this section will not appear here for a while, due to a time machine mishap.
The preview of this section was fed to a gremlin, after midnight.
The preview of this section is in an invisible, microscopic font.
The preview of this section is being chased by zombies.
The preview of this section was accidentally identified as an Android 'tasty treat' by the Cookie Monster.