RTC Accessibility User Requirements

W3C Working Draft

This version:
https://www.w3.org/TR/2020/WD-raur-20201207/
Latest published version:
https://www.w3.org/TR/raur/
Latest editor's draft:
https://w3c.github.io/apa/raur/
Previous version:
https://www.w3.org/TR/2020/WD-raur-20200319/
Editors:
(W3C)
(W3C)
Participate:
GitHub w3c/apa
File a bug
Commit history
Pull requests

Abstract

This document outlines various accessibility-related user needs, requirements and scenarios for real-time communication (RTC). These user needs should drive accessibility requirements in various related specifications and in the overall architecture that enables RTC. The document first introduces a definition of RTC as used throughout and outlines how RTC accessibility can support the needs of people with disabilities. It defines the term 'user needs' in the context of this document and then lists a range of these user needs and their related requirements. Following that, some quality-of-service scenarios are outlined, and finally a data table maps the user needs contained in this document to related use case requirements found in other technical specifications.

This document is most explicitly not a collection of baseline requirements. It is also important to note that some of the requirements may be implemented at a system or platform level, and some may be authoring requirements.

Status of This Document

This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.

This is an updated draft of RTC Accessibility User Requirements by the Accessible Platform Architectures Working Group. It is developed by the Research Questions Task Force (RQTF), who work to identify accessibility knowledge gaps and barriers in emerging and future web technologies. The requirements outlined here come from research into user needs, which in turn provide the basis for any technical requirements. This version includes updates based on public feedback to the First Public Working Draft published 19 March 2020.

To comment, file an issue in the W3C APA GitHub repository adding the RAUR label. If this is not feasible, send email to public-apa@w3.org (archives). Comments are requested by 31 January 2021. In-progress updates to the document may be viewed in the publicly visible editors' draft.

This document was published by the Accessible Platform Architectures Working Group as a Working Draft.

GitHub Issues are preferred for discussion of this specification.

Publication as a Working Draft does not imply endorsement by the W3C Membership.

This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.

This document was produced by a group operating under the 1 August 2017 W3C Patent Policy. The group does not expect this document to become a W3C Recommendation. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

This document is governed by the 15 September 2020 W3C Process Document.

Introduction

What is Real-time communication (RTC)?

The traditional data exchange model is client to server. Real-time communication (RTC) is game-changing because it is enabled in part by specifications such as WebRTC, which provides real-time peer-to-peer audio, video and data exchange directly between supported user agents. This enables instantaneous applications for video and audio calls, text chat, file exchange, screen sharing and gaming, all without the need for browser plugins. However, WebRTC is not the sole specification responsible for enabling accessible real-time communications; the use cases and requirements are broad, as outlined in the IETF 'Web Real-Time Communication Use Cases and Requirements' document, RFC 7478. [ietf-rtc]
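As a non-normative illustration of this peer-to-peer model, the following TypeScript sketch shows how an application might open a WebRTC connection with a data channel that could carry text chat. The sendToSignallingServer function is a hypothetical placeholder, since WebRTC deliberately leaves signalling to the application.

```typescript
// Non-normative sketch: one peer creates an RTCPeerConnection, opens a data
// channel for text, and produces an offer. sendToSignallingServer is a
// hypothetical application-defined function; WebRTC itself does not specify
// how offers and answers are exchanged between peers.

declare function sendToSignallingServer(message: object): void; // placeholder

const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.example.org" }], // assumed STUN server
});

// A data channel can carry chat, file exchange or real-time text.
const chat: RTCDataChannel = pc.createDataChannel("chat");
chat.onopen = () => chat.send("Hello, peer!");
chat.onmessage = (event: MessageEvent) => console.log("Received:", event.data);

async function startCall(): Promise<void> {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // The offer must reach the remote peer out of band (e.g. over a WebSocket).
  sendToSignallingServer({ type: "offer", sdp: pc.localDescription?.sdp });
}
```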

1. Real-time communication and accessibility

RTC has the potential to enable improved accessibility features that will support a broad range of user needs for people with a wide range of disabilities. These needs can be met through improved audio and video quality, audio routing, captioning, improved live transcription, transfer of alternative formats such as sign language, text messaging/chat, real-time user support and status polling.

RTC accessibility is enabled by a combination of technologies and specifications, such as those from the Media Working Group, the Web and Networks Interest Group, the Second Screen Working Group and the Web Audio Working Group, as well as AGWG and ARIA. APA hopes this document will inform how these groups meet their various responsibilities for enabling accessible RTC, as well as updating related use cases in various groups. For example, see the current work on the WebRTC Next Version Use Cases First Public Working Draft. [webrtc-use-cases]

2. User needs definition

This document outlines various accessibility related user needs for RTC accessibility. The term 'user needs' in this document relates to what people with various disabilities need to successfully use RTC applications. These needs may relate to having particular supports in an application, being able to complete tasks or access other functions. These user needs should drive accessibility requirements for RTC accessibility and its related architecture.

User needs are presented here with their related requirements; some in a range of scenarios (which can be thought of as similar to user stories).

3. User needs and requirements

The following outlines a range of user needs and requirements. The user needs have also been compared to existing use cases for real-time text (RTT), such as the IETF 'Framework for Real-Time Text over IP Using the Session Initiation Protocol' (RFC 5194) and the European procurement standard EN 301 549. [rtt-sip] [EN301-549]

3.1 Window anchoring and pinning

3.2 Pause 'on record' captioning in RTC

3.3 Accessibility user preferences and profiles

3.4 Incoming calls and caller ID

3.5 Routing and communication channel control

3.6 Dynamic audio description values in live conferencing

3.7 Quality synchronisation and playback

3.8 Simultaneous voice, text & signing

Note

This user need may also indicate necessary support for 'Total conversation' services as defined by ITU in WebRTC applications. These are combinations of voice, video, and real-time text (RTT) in the same real-time session. [total-conversation]
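As a non-normative sketch only, a 'total conversation' style session could combine voice, video and a text channel over a single peer connection, along the lines below. The data channel here merely stands in for a real-time text transport; production services may use a standardised RTT payload instead.

```typescript
// Non-normative sketch: voice, video and real-time text in one session.
// The data channel stands in for an RTT transport; real services may use a
// standardised payload such as T.140 carried over a data channel instead.

async function startTotalConversation(pc: RTCPeerConnection): Promise<RTCDataChannel> {
  // Capture microphone and camera, adding both tracks to the same connection.
  const media = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  media.getTracks().forEach((track) => pc.addTrack(track, media));

  // Real-time text: transmit each character as it is typed, not per message.
  const rtt = pc.createDataChannel("rtt", { ordered: true });
  document.addEventListener("keypress", (e: KeyboardEvent) => {
    if (rtt.readyState === "open") rtt.send(e.key);
  });
  return rtt;
}
```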

3.9 Emergency calls: Support for Real-time text (RTT)

3.10 Video relay services (VRS) and remote interpretation (VRI)

3.11 Distinguishing sent and received text with RTT

3.12 Call participants and status

3.13 Live transcription and captioning support

3.14 Assistance for users with cognitive disabilities

3.15 Personalized symbol sets for users with cognitive disabilities

Note

This relates to cognitive accessibility requirements. For related work at W3C see the 'Personalization Semantics Content Module 1.0' and 'Media Queries Level 5'. [personalization] [media-queries]

3.16 Internet relay chat (IRC) style interfaces required by blind users

Note

Some braille users will also prefer the RTT model. However, braille users who want text displayed in standard contracted braille might be better served in the manner that users relying on text-to-speech (TTS) engines are served: by buffering the data to be transmitted until an end-of-line character is reached.
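A minimal, non-normative sketch of this buffering approach is shown below; the class and its names are illustrative only. It accumulates typed characters and transmits a whole line once an end-of-line character is reached.

```typescript
// Non-normative sketch: buffer outgoing text until an end-of-line character,
// so that TTS engines or contracted-braille displays receive whole lines
// rather than character-by-character RTT updates. Names are illustrative.

class LineBufferedSender {
  private buffer = "";

  constructor(private channel: RTCDataChannel) {}

  // Called once for each character the user types.
  push(char: string): void {
    this.buffer += char;
    if (char === "\n" || char === "\r") {
      this.channel.send(this.buffer); // transmit the completed line
      this.buffer = "";
    }
  }
}
```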

4. Relationship between RTC and XR Accessibility

There are potential real-time communication application issues that may only apply in immersive environments or augmented reality contexts.

For example, if an RTC application is also an XR application then relevant XR accessibility requirements should be addressed as well. [xaur]

5. Quality of service scenarios

5.1 Deaf users: Video resolution and frame rates

Scenario: A deaf user watching a signed broadcast needs a sufficiently high frame rate to maintain legibility and clarity in order to understand what is being signed.

Note

EN 301 549, Section 6, recommends that WebRTC applications support a frame rate of at least 20 frames per second (FPS). More details can be found in the Accessible Procurement standard for ICT products and services, EN 301 549 (PDF).
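By way of non-normative illustration, an application might request a frame rate consistent with this recommendation through media track constraints, as in the sketch below; whether the constraint can actually be met depends on the camera and the user agent.

```typescript
// Non-normative sketch: request a video track whose frame rate does not fall
// below 20 FPS, in line with the EN 301 549 recommendation for sign language
// legibility. Actual behaviour depends on the camera and the user agent.

async function getSigningVideo(): Promise<MediaStream> {
  return navigator.mediaDevices.getUserMedia({
    video: {
      frameRate: { min: 20, ideal: 30 },
      width: { ideal: 1280 },
      height: { ideal: 720 },
    },
  });
}
```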

5.2 Bandwidth for audio

Scenario: A hard of hearing user needs good stereo sound in order to have a quality experience in work calls or meetings with friends or family. Transmission aspects, such as the decibel range for audio, need to be of high quality. For calls, the industry allows higher audio resolution, but still mostly in mono only.
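As a non-normative sketch, an application might request stereo, higher-fidelity capture through audio constraints along the lines below. Support for the channelCount constraint varies across user agents, and the negotiated codec configuration (for example, stereo Opus) also affects what is actually delivered.

```typescript
// Non-normative sketch: request stereo, higher-fidelity audio capture. Whether
// two channels are actually delivered depends on the microphone, the user
// agent and the negotiated codec (e.g. Opus configured for stereo in SDP).

async function getStereoAudio(): Promise<MediaStream> {
  return navigator.mediaDevices.getUserMedia({
    audio: {
      channelCount: { ideal: 2 },  // ask for stereo where available
      echoCancellation: false,     // audio processing can collapse capture to mono
      noiseSuppression: false,
      autoGainControl: false,
    },
  });
}
```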

5.3 Bandwidth for video

Scenario: A hard of hearing user needs good stereo sound so they can have a quality experience when watching HD video or having an HD meeting with friends or family. Transmission aspects, such as the frame rate that determines video quality, need to be of high quality.

Note

EN 301 549, Section 6, recommends that, for WebRTC-enabled conferencing and communication, the application shall be able to encode and decode communication with a frequency range with an upper limit of at least 7 kHz. More details can be found in the Accessible Procurement standard for ICT products and services, EN 301 549 (PDF).

Note

WebRTC lets applications prioritise the bandwidth dedicated to audio, video and data streams; there is also some experimental work on signalling these needs to the network layer, as well as support for prioritising frame rate over resolution in case of congestion. [webrtc-priority]
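A non-normative sketch of how an application might use these hooks follows; the priority and degradationPreference fields come from the WebRTC priority control work and related extensions, and user agent support for them varies.

```typescript
// Non-normative sketch: give the audio stream higher priority and, under
// congestion, prefer frame rate over resolution for a video sender carrying
// sign language. The priority and degradationPreference fields come from the
// WebRTC priority control work and related extensions; support varies and
// unsupported fields are simply ignored.

async function prioritiseForAccessibility(
  audioSender: RTCRtpSender,
  videoSender: RTCRtpSender
): Promise<void> {
  const audioParams = audioSender.getParameters();
  audioParams.encodings.forEach((e) => ((e as any).priority = "high"));

  const videoParams = videoSender.getParameters();
  videoParams.encodings.forEach((e) => ((e as any).priority = "medium"));
  (videoParams as any).degradationPreference = "maintain-framerate"; // extension field

  await audioSender.setParameters(audioParams);
  await videoSender.setParameters(videoParams);
}
```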

A. Change Log

The following is a list of new user needs in this document:

The following is a list of updated requirements to existing user needs:

The following are other changes in this document:


This document has been updated based on document feedback, discussion and Research Questions Task Force consensus.

B. Acknowledgments

B.1 Participants of the APA working group active in the development of this document:

B.2 Enabling funders

This work is supported by the EC-funded WAI-Guide Project.

C. References

C.1 Informative references

[EN301-549]
Accessibility requirements suitable for public procurement of ICT products and services in Europe. CEN/CENELEC/ETSI. August 2018. URL: http://mandate376.standards.eu/standard
[ietf-relay]
Interoperability Profile for Relay User Equipment. IETF. August 2020. URL: https://tools.ietf.org/html/draft-ietf-rum-rue-02.html
[ietf-rtc]
Web Real-Time Communication Use Cases and Requirements. IETF. March 2015. URL: https://tools.ietf.org/html/rfc7478
[media-queries]
Media Queries Level 5. W3C. 31 July 2020. URL: https://www.w3.org/TR/mediaqueries-5/
[personalization]
Personalization Semantics Content Module 1.0. W3C. 27 January 2020. URL: https://www.w3.org/TR/personalization-semantics-content-1.0/
[rtt-sip]
Framework for Real-Time Text over IP Using the Session Initiation Protocol (SIP). IETF, Network Working Group. June 2008. URL: https://tools.ietf.org/html/rfc5194
[total-conversation]
ITU-T SG 16 Work on Accessibility - Total Conversation. International Telecommunication Union (ITU). 2020. URL: https://www.itu.int/en/ITU-T/studygroups/com16/accessibility/Pages/conversation.aspx
[webrtc-priority]
WebRTC DSCP Control API. W3C. 12 February 2020. URL: https://w3c.github.io/webrtc-priority/
[webrtc-use-cases]
WebRTC Next Version Use Cases. W3C. 11 December 2018. URL: https://www.w3.org/TR/webrtc-nv-use-cases/
[xaur]
XR Accessibility User Requirements. W3C. 16 September 2020. URL: https://www.w3.org/TR/xaur/