Results of Questionnaire AUWG Survey for 7 January 2013 - part 1 (A.1 - A.2)

The results of this questionnaire are available to anybody.

This questionnaire was open from 2012-12-27 to 2013-03-29.

8 answers have been received.

Jump to results for question:

  1. A.1.1.1 tests
  2. A.1.2.1 tests
  3. A.1.2.2 tests
  4. A.2.1.1 tests
  5. A.2.1.2 tests
  6. A.2.2.1 tests
  7. A.2.2.2 tests

1. A.1.1.1 tests

Success Criteria

A.1.1.1 Web-Based Accessible (WCAG): If the authoring tool contains web-based user interfaces, then those web-based user interfaces meet the WCAG 2.0 success criteria. (Level A to meet WCAG 2.0 Level A success criteria; Level AA to meet WCAG 2.0 Level A and AA success criteria; Level AAA to meet all WCAG 2.0 success criteria)

Test(s)

Test 0001 Assertion: Any web-based portions of an authoring tool user interface meet WCAG 2.0 at A level.

  1. If no parts of the authoring tool are web-based (i.e. they are rendered within a user agent), then select SKIP.
  2. If the web-based parts include editing views (as opposed to only documentation, etc.), then load the accessible test content file (Level A).
  3. Check the web-based parts of the authoring tool with the Web Content Accessibility Test Procedure (Level A).
  4. If WCAG 2.0 Level A is reached, select PASS, otherwise select FAIL.

Test 0002 Assertion: Any web-based portions of an authoring tool user interface meet WCAG 2.0 at AA level. (Note: Only shown if the user has specified AA or AAA as the target level)

  1. If no parts of the authoring tool are web-based (i.e. they are rendered within a user agent), then select SKIP.
  2. If the web-based parts include editing views (as opposed to only documentation, etc.), then load the accessible test content file (Level AA).
  3. Check the web-based parts of the authoring tool with the Web Content Accessibility Test Procedure (Level AA).
  4. If WCAG 2.0 Level AA is reached, select PASS, otherwise select FAIL.

Test 0003 Assertion: Any web-based portions of an authoring tool user interface meet WCAG 2.0 at AAA level. (Note: Only shown if the user has specified AAA as the target level)

  1. If no parts of the authoring tool are web-based (i.e. they are rendered within a user agent), then select SKIP.
  2. If the web-based parts include editing views (as opposed to only documentation, etc.), then load the accessible test content file (Level AAA).
  3. Check the web-based parts of the authoring tool with the Web Content Accessibility Test Procedure (Level AAA).
  4. If WCAG 2.0 Level AAA is reached, select PASS, otherwise select FAIL.
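
Note (illustrative, not part of the proposal): the Web Content Accessibility Test Procedure referenced above is a manual check. The TypeScript sketch below shows one way part of the Level A or AA check could be pre-screened automatically, assuming the open-source axe-core library is available in the web-based authoring user interface. Automated checkers cover only a subset of the WCAG 2.0 success criteria, so a clean result here does not by itself justify selecting PASS.

    // Sketch only: assumes axe-core is installed (npm install axe-core) and that
    // this script runs inside the web-based authoring UI under evaluation.
    // (The import form may vary with module settings; a script tag exposing the
    // global `axe` also works.)
    import axe from "axe-core";

    async function preCheckWcag(level: "wcag2a" | "wcag2aa"): Promise<boolean> {
      // Limit the run to rules tagged with the requested WCAG 2.0 level.
      const results = await axe.run(document, {
        runOnly: { type: "tag", values: [level] },
      });
      for (const violation of results.violations) {
        // Each violation names the rule and the offending nodes, which can be
        // fed back into the manual test procedure.
        console.warn(violation.id + ": " + violation.help,
          violation.nodes.length + " node(s)");
      }
      return results.violations.length === 0;
    }

    // Example: automated pre-screen of the current editing view at Level A.
    preCheckWcag("wcag2a").then((clean) =>
      console.log(clean ? "No automated Level A violations" : "Automated violations found"));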

Summary

Choice (number of responders)
  Accept the proposal: 6
  Recommend changes (see comments field): 0
  The proposal needs more discussion (see comments field): 0
  Disagree with the proposal: 0
  Neutral - will accept the consensus of the group: 1

Details

Responder / A.1.1.1 tests / Comments on A.1.1.1
Jeanne F Spellman: Accept the proposal
Roberto Scano: Accept the proposal
Frederick Boland: Accept the proposal
Jan Richards: Accept the proposal
Alessandro Miele: Accept the proposal
Greg Pisocky: (no response to this question)
Alex Li: Neutral - will accept the consensus of the group. Comment: Not sure what the "accessible test content file" is supposed to be.

I'm not sure the screen reader test should be part of the Web Content Accessibility Test Procedure, because bugs in the screen reader will then be mistakenly attributed to the authoring tool or web content. The flip side is also true: just because a screen reader presents the content to users does not necessarily mean that the content meets the WCAG 2.0 success criteria.

Sueann Nichols: Accept the proposal

2. A.1.2.1 tests

Success Criteria

A.1.2.1 Accessibility Guidelines: If the authoring tool contains non-web-based user interfaces, then those non-web-based user interfaces follow user interface accessibility guidelines for the platform. (Level A)

Test(s)

Test 0001 Assertion: Any non-web-based portions of the authoring tool user interface follow the accessibility guidelines for the platform.

  1. If the authoring tool is entirely web-based (i.e. lacking any functionality that runs outside of a user agent), then select SKIP.
  2. If the non-web-based parts include editing views (as opposed to only a file uploader, etc.), then load the accessible test content file (any level).
  3. If the accessibility guideline (for the platform) that was used by the developer is known, this should be used to test the accessibility of the user interface. If the user interface has followed the guidelines, then select PASS, otherwise select FAIL.
  4. If the accessibility guideline (for the platform) that was used by the developer is not known, choose the most relevant from this list of user interface accessibility guidelines for various platforms and use it to test the accessibility of the user interface. If you find instances where the user interface has not followed the guidelines, then select FAIL, otherwise select PASS.

Summary

Choice (number of responders)
  Accept the proposal: 7
  Recommend changes (see comments field): 0
  The proposal needs more discussion (see comments field): 0
  Disagree with the proposal (see comments field): 0
  Neutral - will accept the consensus of the group: 1

Details

Responder / A.1.2.1 tests / Comments on A.1.2.1
Jeanne F Spellman: Accept the proposal
Roberto Scano: Accept the proposal
Frederick Boland: Accept the proposal
Jan Richards: Accept the proposal
Alessandro Miele: Accept the proposal
Greg Pisocky: Accept the proposal
Alex Li: Neutral - will accept the consensus of the group. Comment: Again, what is the accessible test content file?
Sueann Nichols: Accept the proposal

3. A.1.2.2 tests

Success Criteria

A.1.2.2 Platform Accessibility Services: If the authoring tool contains non-web-based user interfaces, then those non-web-based user interfaces expose accessibility information through platform accessibility services. (Level A)

Test(s)

Test 0001 Assertion: Any non-web-based components of the authoring tool user interface successfully communicate name and role with the platform accessibility services (accessibility APIs).

  1. Examine the authoring tool user interface to determine whether any parts of it are not web-based (i.e. they run outside of a user agent).
  2. If the authoring tool is entirely web-based, then select SKIP.
  3. For each non-web-based operable user interface component in the authoring tool user interface:
    1. Check the component with the Platform Accessibility Service Test Procedure to determine whether it (a) is present, (b) has an accessible name, and (c) has an appropriate UI component role.
    2. If any of (a)-(c) are not the case for any user interface components, then select FAIL.
    3. Go to the next non-web-based operable user interface component (if any).
  4. Select PASS (all of the non-web-based components must have passed)
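
Note (illustrative, not part of the proposal): the Platform Accessibility Service Test Procedure is normally carried out with a platform inspection tool (for example Accerciser for AT-SPI or Inspect for UI Automation; these are examples, not requirements of the test). The same two properties, accessible name and role, also exist for web content, and the simplified TypeScript sketch below shows what checks (b) and (c) amount to in that setting. It is an analogy only; for non-web-based components the platform accessibility tree itself must be inspected.

    // Simplified illustration of "accessible name" and "role"; a real check
    // follows the W3C Accessible Name and Description Computation.
    function simplifiedName(el: HTMLElement): string {
      const labelledby = el.getAttribute("aria-labelledby");
      if (labelledby) {
        const text = labelledby.split(/\s+/)
          .map((id) => document.getElementById(id)?.textContent ?? "")
          .join(" ").trim();
        if (text) return text;
      }
      const ariaLabel = el.getAttribute("aria-label");
      if (ariaLabel && ariaLabel.trim()) return ariaLabel.trim();
      if (el instanceof HTMLInputElement && el.labels && el.labels.length > 0) {
        return el.labels[0].textContent?.trim() ?? "";
      }
      return el.textContent?.trim() ?? "";
    }

    function simplifiedRole(el: HTMLElement): string {
      // An explicit role attribute wins; otherwise fall back to the tag name
      // as a stand-in for the implicit role mapping.
      return el.getAttribute("role") ?? el.tagName.toLowerCase();
    }

    // Flag operable components that would fail check (b) or (c).
    document.querySelectorAll<HTMLElement>(
      "a[href], button, input, select, textarea, [role], [tabindex]"
    ).forEach((el) => {
      if (!simplifiedName(el)) console.warn("Missing accessible name:", el);
      if (!simplifiedRole(el)) console.warn("Missing role:", el);
    });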

Summary

Choice (number of responders)
  Accept the proposal: 8
  Recommend changes (see comments field): 0
  The proposal needs more discussion (see comments field): 0
  Disagree with the proposal (see comments field): 0
  Neutral - will accept the consensus of the group: 0

Details

Responder / A.1.2.2 tests / Comments on A.1.2.2
Jeanne F Spellman: Accept the proposal
Roberto Scano: Accept the proposal
Frederick Boland: Accept the proposal
Jan Richards: Accept the proposal
Alessandro Miele: Accept the proposal
Greg Pisocky: Accept the proposal
Alex Li: Accept the proposal
Sueann Nichols: Accept the proposal

4. A.2.1.1 tests

Success Criteria

A.2.1.1 Text Alternatives for Rendered Non-Text Content: If an editing-view renders non-text content, then any programmatically associated text alternatives for the non-text content can be programmatically determined. (Level A)

Test(s)

Test 0001 Assertion: Any editing views that render audio-video include an option to display alternatives or an option to preview the media in a user agent capable of rendering the alternatives.

  1. If the authoring tool does not include editing views that are capable of rendering audio-video, then select SKIP.
  2. For each editing view that renders audio-video:
    1. Load the accessible test content file (any level), which contains time-based media (audio-video) with alternatives (e.g. captions, transcripts, audio-description), in the editing view.
    2. Check if the authoring tool allows the content being edited to be previewed in a user agent (e.g. browser or media player) where the alternatives can be rendered. If so, go to the next editing view that renders audio-video.
    3. Check if the authoring tool can be set to render the alternatives itself. If not, select FAIL.
    4. Go to the next editing view that renders audio-video (if any).
  3. Select PASS (all editing views must have passed)

Test 0002 Assertion: Any editing views that render video-only media include an option to display alternatives or an option to preview the media in a user agent capable of rendering the alternatives.

  1. If the authoring tool does not include editing views that are capable of rendering video-only, then select SKIP.
  2. For each editing view that renders video-only media:
    1. Load the accessible test content file (any level), which contains time-based media (video-only) with alternatives (e.g. transcripts, audio-description), in the editing view.
    2. Check if the authoring tool allows the content being edited to be previewed in a user agent (e.g. browser or media player) where the alternatives can be rendered. If so, go to the next editing view that renders video-only media.
    3. Check if the authoring tool can be set to render the alternatives itself. If not, select FAIL.
    4. Go to the next editing view that renders video-only media (if any).
  3. Select PASS (all editing views must have passed)

Test 0003 Assertion: Any editing views that render audio-only media include an option to display alternatives or an option to preview the media in a user agent capable of rendering the alternatives.

  1. If the authoring tool does not include editing views that are capable of rendering audio-only, then select SKIP.
  2. For each editing view that renders audio-only media:
    1. Load the accessible test content file (any level), which contains time-based media (audio-only) with alternatives (e.g. transcripts, alternative for time-based media), in the editing view.
    2. Check if the authoring tool allows the content being edited to be previewed in a user agent (e.g. browser or media player) where the alternatives can be rendered. If so, go to the next editing view that renders audio-only media.
    3. Check if the authoring tool can be set to render the alternatives itself. If not, select FAIL.
    4. Go to the next editing view that renders audio-only media (if any).
  3. Select PASS (all editing views must have passed)
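
Note (illustrative, not part of the proposal): the three procedures above depend on the accessible test content file actually containing alternatives for its time-based media (a point raised in the comments below). Assuming the test content is HTML5 audio/video, the TypeScript sketch below checks for programmatically determinable alternatives (caption, subtitle or description tracks, or a linked transcript marked with a hypothetical "transcript" class) before the file is loaded into an editing view.

    // Sketch only: assumes the accessible test content file is an HTML document
    // loaded in this window; the "transcript" class and data-transcript
    // attribute are hypothetical conventions for this sketch.
    function mediaAlternativesPresent(doc: Document): boolean {
      let ok = true;
      doc.querySelectorAll("video, audio").forEach((media) => {
        const tracks = media.querySelectorAll(
          "track[kind='captions'], track[kind='subtitles'], track[kind='descriptions']");
        const transcript = doc.querySelector("a.transcript, [data-transcript]");
        if (tracks.length === 0 && !transcript) {
          console.warn("Time-based media without detectable alternatives:", media);
          ok = false;
        }
      });
      return ok;
    }

    console.log("Alternatives present in test content:",
      mediaAlternativesPresent(document));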

Summary

Choice (number of responders)
  Accept the proposal: 6
  Recommend changes (see comments field): 0
  The proposal needs more discussion (see comments field): 1
  Disagree with the proposal (see comments field): 1
  Neutral - will accept the consensus of the group: 0

Details

Responder / A.2.1.1 tests / Comments on A.2.1.1
Jeanne F Spellman: Accept the proposal
Roberto Scano: Accept the proposal
Frederick Boland: Accept the proposal
Jan Richards: Accept the proposal
Alessandro Miele: Accept the proposal
Greg Pisocky: Disagree with the proposal (see comments field). Comment: Remember this is an authoring tool for multimedia, either audio or video, and as such the alternatives may not be available because this is raw content. The test needs some kind of qualifier that doesn't automatically assume that an alternative already exists. You're not going to have a closed-caption transcript for a video ready until the video has been edited. No one is going to bother producing alternatives for content that ends up on the cutting room floor. Chicken, meet egg. Egg, say hello to chicken.
Alex Li: The proposal needs more discussion (see comments field). Comment: Why does the test assume multiple editing views? I don't understand the language about "next editing view".

Test 0002 step 2.1 has an ambiguous "If" statement.

The test should be whether the authoring tool can render the alternative, not whether the authoring tool can be set to render the alternative. It is possible for the authoring tool to obtain user preferences from the OS or other sources, in which case it will automatically render alternate content.

Sueann Nichols: Accept the proposal

5. A.2.1.2 tests

Success Criteria

A.2.1.2 Alternatives for Rendered Time-Based Media: If an editing-view renders time-based media, then at least one of the following is true: (Level A)
  (a) Option to Render: The authoring tool provides the option to render alternatives for the time-based media; or
  (b) User Agent Option: Authors have the option to preview the time-based media in a user agent that is able to render the alternatives.

Test(s)

Test 0001 Assertion: Any editing views that render audio-video include an option to display alternatives or an option to preview the media in a user agent capable of rendering the alternatives.

  1. If the authoring tool does not include editing views that are capable of rendering audio-video, then select SKIP.
  2. For each editing view that renders audio-video:
    1. Load the accessible test content file (any level), which contains time-based media (audio-video) with alternatives (e.g. captions, transcripts, audio-description), in the editing view.
    2. Check if the authoring tool allows the content being edited to be previewed in a user agent (e.g. browser or media player) where the alternatives can be rendered. If so, go to the next editing view that renders audio-video.
    3. Check if the authoring tool can be set to render the alternatives itself. If not, select FAIL.
    4. Go to the next editing view that renders audio-video (if any).
  3. Select PASS (all editing views must have passed)

Test 0002 Assertion: Any editing views that render video-only media include an option to display alternatives or an option to preview the media in a user agent capable of rendering the alternatives.

  1. If the authoring tool does not include editing views that are capable of rendering video-only, then select SKIP.
  2. For each editing view that renders video-only media:
    1. Load the accessible test content file (any level), which contains time-based media (video-only) with alternatives (e.g. transcripts, audio-description), in the editing view.
    2. Check if the authoring tool allows the content being edited to be previewed in a user agent (e.g. browser or media player) where the alternatives can be rendered. If so, go to the next editing view that renders video-only media.
    3. Check if the authoring tool can be set to render the alternatives itself. If not, select FAIL.
    4. Go to the next editing view that renders video-only media (if any).
  3. Select PASS (all editing views must have passed)

Test 0003 Assertion: Any editing views that render audio-only media include an option to display alternatives or an option to preview the media in a user agent capable of rendering the alternatives.

  1. If the authoring tool does not include editing views that are capable of rendering audio-only, then select SKIP.
  2. For each editing view that renders audio-only media:
    1. Load the accessible test content file (any level), which contains time-based media (audio-only) with alternatives (e.g. transcripts, alternative for time-based media), in the editing view.
    2. Check if the authoring tool allows the content being edited to be previewed in a user agent (e.g. browser or media player) where the alternatives can be rendered. If so, go to the next editing view that renders audio-only media.
    3. Check if the authoring tool can be set to render the alternatives itself. If not, select FAIL.
    4. Go to the next editing view that renders audio-only media (if any).
  3. Select PASS (all editing views must have passed)
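
Note (illustrative, not part of the proposal): option (b) of A.2.1.2 only helps if the preview user agent can actually render the alternatives. The TypeScript sketch below feature-detects text-track (caption/subtitle) support in a browser used for previewing HTML5 media; the HTML5 assumption is mine, and other player technologies would need their own capability checks.

    // Sketch only: feature-detects text track support in the user agent used
    // for the "preview" option of A.2.1.2 (b), assuming HTML5 media content.
    function previewSupportsTextTracks(): boolean {
      const video = document.createElement("video");
      // HTMLMediaElement.textTracks and the <track> element are the standard
      // caption/subtitle mechanisms in HTML5 user agents.
      return "textTracks" in video && typeof HTMLTrackElement !== "undefined";
    }

    console.log("Preview user agent can render text tracks:",
      previewSupportsTextTracks());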

Summary

Choice (number of responders)
  Accept the proposal: 7
  Recommend changes (see comments field): 0
  The proposal needs more discussion (see comments field): 1
  Disagree with the proposal (see comments field): 0
  Neutral - will accept the consensus of the group: 0

Details

Responder / A.2.1.2 tests / Comments on A.2.1.2
Jeanne F Spellman: Accept the proposal
Roberto Scano: Accept the proposal
Frederick Boland: Accept the proposal
Jan Richards: Accept the proposal
Alessandro Miele: Accept the proposal
Greg Pisocky: Accept the proposal. Comment: This one is different from A.1.1.1 because it addresses the issue of the tool having a mechanism to display these alternative options, but does not assume the alternative content is available.
Alex Li: The proposal needs more discussion (see comments field). Comment: If the tests are exactly the same, then we have duplicative SC.
Sueann Nichols: Accept the proposal

6. A.2.2.1 tests

Success Criteria

A.2.2.1 Editing-View Status Indicators: If an editing-view adds status indicators to the content being edited, then the information being conveyed by the status indicators can be programmatically determined. (Level A)

Note: Status indicators may indicate errors (e.g., spelling errors), tracked changes, hidden elements, or other information.

Test(s)

Test 0001 Assertion: For web-based tools: Status information about the content (e.g. spell checking) can be programmatically determined.

  1. If the editing view is non-web-based, then select SKIP.
  2. Check all of the editing-views for status indicators (often used to indicate errors (e.g. spelling errors), tracked changes, hidden elements, or other information), possibly by consulting a product feature list or by trial and error.
  3. If the authoring tool does not provide status indicators in any of its editing-views, select SKIP.
  4. For each type of status indicator:
    1. Open or author content that will trigger the status indicator (e.g. text with spelling errors, a hidden element, etc.), then examine the indicator with the Web Content Accessibility Test Procedure. If an indicator does not pass, then select FAIL.
    2. Go to the next type of status indicator (if any).
  5. Select PASS (all indicators must have passed)
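
Note (illustrative, not part of the proposal): for web-based editing views, "programmatically determined" generally means the status is exposed in the markup or accessibility tree rather than only by colour or an icon. The TypeScript sketch below shows one way a spelling-error indicator could be exposed so that step 4.1 succeeds; the id scheme, message text and "visually-hidden" class are hypothetical.

    // Sketch only: marks a misspelled word so the status is programmatically
    // determinable (ARIA), not conveyed by a red underline alone.
    let msgCounter = 0;
    function flagSpellingError(wordSpan: HTMLElement, suggestion: string): void {
      wordSpan.setAttribute("aria-invalid", "spelling");   // machine-readable status
      const msg = document.createElement("span");
      msg.id = "spell-msg-" + (++msgCounter);
      msg.className = "visually-hidden";                   // hidden visually, still exposed
      msg.textContent = 'Possible spelling error; did you mean "' + suggestion + '"?';
      wordSpan.setAttribute("aria-describedby", msg.id);
      wordSpan.insertAdjacentElement("afterend", msg);
    }

    // Step 4.1 can then read the status back instead of relying on styling:
    document.querySelectorAll("[aria-invalid='spelling']").forEach((el) =>
      console.log("Spelling indicator on:", el.textContent,
        "described by:", el.getAttribute("aria-describedby")));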

Test 0002 Assertion: For non-web-based tools: Status information about the content (e.g. spell checking) can be programmatically determined.

  1. If the editing view is web-based, then select SKIP.
  2. Check all of the editing-views for status indicators (often used to indicate errors (e.g. spelling errors), tracked changes, hidden elements, or other information), possibly by consulting a product feature list or by trial and error.
  3. If the authoring tool does not provide status indicators in any of its editing-views, select SKIP.
  4. For each type of status indicator:
    1. Open or author content that will trigger the status indicator (e.g. text with spelling errors, a hidden element, etc.), then examine the indicator with the Platform Accessibility Service Test Procedure to determine if the indicator (e.g. the validation error) has been communicated to the accessibility API. If an indicator does not pass, then select FAIL.
    2. Go to the next type of status indicator (if any).
  5. Select PASS (all indicators must have passed)

Summary

Choice (number of responders)
  Accept the proposal: 7
  Recommend changes (see comments field): 0
  The proposal needs more discussion (see comments field): 0
  Disagree with the proposal (see comments field): 0
  Neutral - will accept the consensus of the group: 1

Details

Responder / A.2.2.1 tests / Comments on A.2.2.1
Jeanne F Spellman: Accept the proposal
Roberto Scano: Accept the proposal
Frederick Boland: Accept the proposal
Jan Richards: Accept the proposal
Alessandro Miele: Accept the proposal
Greg Pisocky: Accept the proposal
Alex Li: Neutral - will accept the consensus of the group. Comment: Step 2 indicates that the SC is not testable.
Sueann Nichols: Accept the proposal

7. A.2.2.2 tests

Success Criteria

A.2.2.2 Access to Rendered Text Properties: If an editing-view renders any text formatting properties that authors can also edit using the editing-view, then the properties can be programmatically determined. (Level AA)

Test(s)

Test 0001 Assertion: For web-based tools: if authors can produce rich text formatting, then the same formatting is programmatically determinable within the authoring tool.

Test 0001 Author: Jan

  1. If the authoring tool does not include any web-based editing views, then select SKIP.
  2. If the authoring tool cannot be used to author rich text content (e.g. because it only edits non-text graphics), then select SKIP.
  3. If the authoring tool can be used to author rich text content, but none of its editing views render this rich text content, then select SKIP. (Note: Some text editors render text content (e.g. markup tags, programming keywords, etc.) with various colors, bold weight, etc. These author supports are not part of the content and do not qualify here)
  4. For each editing view that renders rich text content:
    1. Add some text content and, one format per line to avoid overlap, use as many rich text formats as possible (e.g. bold, italic, font face, superscript, etc.).
    2. Produce a version of the final output (for comparison)
    3. Use a web content markup examination tool to examine how the rich text in the editing-view is actually presented via the user agent.
    4. If the markup matches that produced in the final output, then go to the next editing view that renders rich text content.
    5. If the rich text is presented via means that block programmatic determinability (e.g., using space gifs to mimic wider spacing, using a positioned div to mimic superscript), then select FAIL.
    6. Go to the next editing view that renders rich text content (if any).
  5. Select PASS (all editing views must have passed)
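
Note (illustrative, not part of the proposal): step 4.3 refers to a "web content markup examination tool" (for example, a browser DOM inspector). The TypeScript sketch below illustrates the same idea: it reports the formatting exposed programmatically through semantics and computed styles, and flags the presentation hacks named in step 4.5 (spacer images, positioned divs). The ".editing-view" selector is hypothetical.

    // Sketch only: the ".editing-view" selector is hypothetical.
    const view = document.querySelector<HTMLElement>(".editing-view");
    if (view) {
      view.querySelectorAll<HTMLElement>("*").forEach((el) => {
        const style = window.getComputedStyle(el);
        // Formatting exposed via semantics or computed style is
        // programmatically determinable (steps 4.3 and 4.4).
        console.log({
          tag: el.tagName.toLowerCase(),
          bold: style.fontWeight === "700" || style.fontWeight === "bold"
            || el.tagName === "B" || el.tagName === "STRONG",
          italic: style.fontStyle === "italic"
            || el.tagName === "I" || el.tagName === "EM",
          fontFamily: style.fontFamily,
          verticalAlign: style.verticalAlign, // "super"/"sub" for real super/subscript
        });
        // Heuristics for the failure cases in step 4.5.
        if (el instanceof HTMLImageElement && !el.alt && el.width <= 3) {
          console.warn("Possible spacer image used for formatting:", el);
        }
        if (el.tagName === "DIV" && style.position === "absolute"
            && el.textContent && el.textContent.trim()) {
          console.warn("Possible positioned div mimicking text formatting:", el);
        }
      });
    }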

Test 0002 Assertion: For non-web-based tools: Text formatting is available via the platform accessibility service (e.g. API).

Test 0002 Author: Jan

  1. If the authoring tool only includes web-based editing views, then select SKIP.
  2. If the authoring tool cannot be used to author rich text content (e.g. because it only edits non-text graphics), then select SKIP.
  3. If the authoring tool can be used to author rich text content, but none of its editing views render this rich text content, then select SKIP. (Note: Some text editors render text content (e.g. markup tags, programming keywords, etc.) with various colors, bold weight, etc. These author supports are not part of the content and do not qualify here)
  4. For each editing view that renders rich text content:
    1. Add some text content and, one format per line to avoid overlap, use as many rich text formats as possible (e.g. bold, italic, underlined, font face, superscript, etc.).
    2. Using a screen reader that is appropriate to the platform, attempt to identify the following text formatting properties (if they could be edited in the previous step): font face, font size, font color, whether text is bold, whether text is italic. If they cannot be identified, then select FAIL.
    3. Go to the next editing view that renders rich text content (if any).
  5. Select PASS (all editing views must have passed)

Summary

Choice (number of responders)
  Accept the proposal: 7
  Recommend changes (see comments field): 0
  The proposal needs more discussion (see comments field): 0
  Disagree with the proposal (see comments field): 1
  Neutral - will accept the consensus of the group: 0

Details

Responder / A.2.2.2 tests / Comments on A.2.2.2
Jeanne F Spellman: Accept the proposal
Roberto Scano: Accept the proposal
Frederick Boland: Accept the proposal
Jan Richards: Accept the proposal
Alessandro Miele: Accept the proposal
Greg Pisocky: Accept the proposal
Alex Li: Disagree with the proposal (see comments field). Comment: For all steps with "SKIP", specify where to skip to.

Screen readers should not be used for test 0002 step 4.2. Appropriate accessibility checkers should be used to identify programmatic exposure.

Sueann Nichols: Accept the proposal

More details on responses

  • Jeanne F Spellman: last responded on 27 December 2012 at 21:58 (UTC)
  • Roberto Scano: last responded on 28 December 2012 at 07:46 (UTC)
  • Frederick Boland: last responded on 4 January 2013 at 21:49 (UTC)
  • Jan Richards: last responded on 5 January 2013 at 16:57 (UTC)
  • Alessandro Miele: last responded on 6 January 2013 at 13:35 (UTC)
  • Greg Pisocky: last responded on 7 January 2013 at 15:32 (UTC)
  • Alex Li: last responded on 25 January 2013 at 23:07 (UTC)
  • Sueann Nichols: last responded on 28 January 2013 at 18:33 (UTC)

Everybody has responded to this questionnaire.

