This is an archived snapshot of W3C's public bugzilla bug tracker, decommissioned in April 2019. Please see the home page for more details.
Accessibility on touch interfaces typically distinguishes between buttons and keys. For example, in Android and iOS, when the user lands on a button via explore-by-touch, they double-tap to activate it. Keys provide quicker, more efficient access for prolonged use: the user simply raises their finger from the display once a key is reached to activate it. Keys can be found in the on-screen keyboard and in the keypad of a phone's dialer app.

I propose introducing another ARIA role for keys, called "key". It would be a subclass of either "button", or of "command" directly. There could be many other ways to implement this, using other attributes and flags on button elements, like touchtype="true". I think introducing a new role for this makes sense because other ATs with different input models could interpret the role to make activation a bit easier than with typical buttons. For example, a switch AT could use shorter dwell times for keys, or a braille display could offer a special mode and markup for keys.

You can read the rest of the discussion at: https://bugzilla.mozilla.org/show_bug.cgi?id=808596
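To make the two approaches concrete, a sketch of the proposed markup — note that neither role="key" nor the touchtype attribute exists in the ARIA specification; both are hypothetical forms taken from the proposal above, illustrated here with the dialer-keypad example:

```html
<!-- Sketch 1: a dedicated "key" role (hypothetical, not in ARIA).
     An AT could apply lift-to-activate behavior to these elements
     instead of the double-tap gesture used for ordinary buttons. -->
<div role="group" aria-label="Dial pad">
  <div role="key" tabindex="0">1</div>
  <div role="key" tabindex="0">2</div>
  <div role="key" tabindex="0">3</div>
</div>

<!-- Sketch 2: keep role="button" and flag key-like activation with
     an attribute (also hypothetical), as suggested as an alternative. -->
<div role="button" touchtype="true" tabindex="0">1</div>
```

In either form, the extra semantics only change how an AT activates the element; authors would still wire up click/keydown handlers as for any custom button.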
I like this idea. I think it should subclass "button" for maximum backwards compatibility. This role already has a native mapping on iOS (UIAccessibilityTraitKeyboardKey).
https://github.com/w3c/aria/issues/730