Summary of comments received since Gregg sent "Key results and
recommendations from Face to Face"
This is a very rough collection of notes to help me create summaries
for the WCAG WG. I provide it publicly in case anyone finds it interesting,
although I expect that it will not be used by anyone but me.
1) Can't use UAAG as Baseline
It was concluded that UAAG 1.0 does not resolve the baseline issue
because it does not answer key questions, such as whether script support
is provided.
We will therefore not be relying on UAAG as a baseline.
Comments:
- Agree. It also brings up dependency issues which may have benefits for
the adoption of UAAG but may get very tricky to handle in the future. (Shadi
Abou-Zahra msg #1)
- Just because UAAG 1.0 doesn't answer all questions doesn't mean it
shouldn't be a key source in profiling User Agent expectations. (Al
Gilman msg #1)
- Gregg
responds to Al: Agree. We didn't say we won't be using it; we
just said that it didn't answer the questions needed about baseline.
It didn't set a baseline for what authors could assume - only what UA
makers should provide.
2) WCAG will not set an explicit baseline
WCAG will not set an explicit baseline; instead, external decision
makers will make baseline assumptions or requirements. These include [but
are not limited to] governments, customers, companies, managers, and
authors.
Comments:
3) WCAG written as functional outcomes, not assuming user tools and
technologies
WCAG will not explicitly state what technologies are supported or what
tools users will have at their disposal. WCAG criteria should be written as
functional outcomes (see clarification #1) and therefore should not be
specific to any technologies such as scripting, CSS, etc.
Comments:
- Agree. I also think that proving WCAG 2.0 works on different tools and
technologies could be part of the implementations that get it to
Recommendation stage. (Shadi
Abou-Zahra msg #1)
- WCAG requirements should be identified at the functional outcomes
level. The W3C standard way
to publish this is a Requirements Document which has the status of a
Working Group Note.
In addition, requirements should be identified in bindings to a
variety of widely-used Web technologies.
These can and perhaps should be normative.
At the very least, they should be no less authoritative than the more
abstract requirements. (Al
Gilman msg #1)
4) With regard to baseline and techniques:
- Techniques cannot be more restrictive than guidelines; otherwise
techniques become normative. [and Techniques should never be
normative.]
- Techniques documents may provide multiple techniques, and those
techniques may differ based on user agent assumptions. For example, we
could have two techniques: 1. how to make scripts accessible for user
agents and assistive technologies that support scripts; 2. how to write
content in such a way that if scripts are turned off the content degrades
gracefully (i.e., is still usable without scripting). However, these two
techniques are not mutually exclusive, and one or the other is used
depending on what technology choices are made.
Comments:
- Al Gilman (Al
Gilman msg #1)
- Cannot follow the logic here. Normative is not equal to restrictive.
The UAAG has provided a fine example of declaring *sufficient*
techniques which are then normative, but not restrictive.
For example:
http://www.w3.org/TR/UAAG10/guidelines.html#tech-conditional-content
- Normative techniques would seem to be necessary to synchronize
practice between authoring tools and user tools. Take the example Jon
Gunderson is pushing us to answer: "How does the AT or UA know from
the markup that something is a navbar?" (so that go-to and
escape-from methods can be furnished for this object). Authors won't
do what the browsers and AT don't support, and vice versa. Agreements
in the form of consensus conventions are the W3C way out of this
chicken-and-egg dilemma. How do you justify the above statement? It
seems straight backwards.
- If the techniques are not normative, how will anybody on one side
of the equation believe that the other side will fulfil the
contract?
- In one model of progress, practices evolve from experimental
techniques to known-good techniques to normative techniques to hard
technology (embedded in and enforced by the technology platform).
Normative techniques are an essential stage in the life-cycle of
technological practice.
- @@questionable. These days, normative techniques *rarely* seem to be
the ones that people take up. Look at how the browser wars even
started...none of that was normative. It is not normative techniques
that drive adoption; it's style, fashion, and other factors. If
normative techniques drove people to do things, we wouldn't have the
problems that we do with CSS, HTML, etc.
- John Slatin asks, "this seems to work for technology, what about
content?"
- Al responds (Al
Gilman msg #2) Web content is nothing if not technology. It
is persistent data so that an author can input it and a user
extract it at different times. It is data on the wire between
software the author selected and software the user selected.
It is rules for capturing and encoding and decoding and
interpreting those data.
- Without addressing the message in its encoded (i.e.
technological) form, you can't distinguish between appropriate
and inappropriate Web practice.
- "structure your content" is unenforceably vague.
Guidelines for structure would be things like how much
timeToSpeak or how many active elements could be in an
unstructured section of web media. Then how to organize the
structure when you exceed those limits (as web pages routinely do
now).
- John responds - There is nothing in WCAG 2.0 as vague as
"structure your content." We are striving for success criteria
that are written as testable assertions about content in its
encoded form, and to provide sufficient and testable techniques
for encoding that content.
- Doyle asks, Shouldn't the testable techniques "decode" the
encoded content?
- John
responds (golden nugget) - I wrote the above as a
response to Al's very careful explanation of what
he meant in saying that Web content is a technology.
The central point I took away from his careful exposition
was that Web content is technology because in order to be
delivered via the Web all content (text, video,
etc., etc.) has to be "encoded" in some form such as
HTML, SVG, SMIL, Flash, PDF, JPEG, etc.
- With that in mind, I think it makes sense to think of
WCAG as specifying success criteria for content as
encoded for the Web, and to think of the techniques
documents as providing advice about how to do the actual
encoding. The tests then verify whether the encoding has
been done correctly.
- Jason
responds: Gregg quite ingeniously suggested tightening
the success criteria to demand that structure, text
equivalents, lexicographical information and other aspects of
the content required by the guidelines should be encoded in a
"standard and supported manner". The idea was that what
"standard and supported" meant could be construed as an
empirical question to which the techniques documents would
provide answers for each format or
language. "Standard and supported" would then be defined
in terms of conformance to technical specifications, and the
adoption of practices supported "in the field" (Gregg's
words, if I recall correctly), either by implementations or
as suggested by documents such as WCAG techniques.
- Another way of interpreting the proposal, however, would
be to argue that it effectively makes the techniques
normative: where there is no established practice or
implementation experience that settles what "standard and
supported" amounts to, the techniques would become de facto
authoritative. This could be seen as crossing the
demarcational line between informative and normative by
effectively requiring the techniques documents to be followed
in order to satisfy the "standard and supported" requirement.
- Gregg
responds: Pretty good, Jason, except where you made
the leap to "if there is no standard then whatever the
techniques doc says is de facto standard". This is not true.
If there is no standard and supported manner then there
is none -- and you can't comply with that technology. Not
being in the techniques doc however does not mean it
doesn't exist. And a technique that is in the techniques
doc that is not labeled as standard and supported is also
not sufficient. If a technique is in the techniques doc
as a standard and supported technique and it is not -
then the techniques doc is in error but still the
technique does not satisfy the guidelines.
- The only way to satisfy the guidelines or their success
criteria is to satisfy the success criteria. The
techniques docs just document known ways to do that (and
the GUIDE doc also helps you understand the SC.) But
things can satisfy the SC that are NOT in the techniques
doc. And there will be techniques in the techniques doc
that do not satisfy the SC.
- Jason
responds: Suppose a format specification provides
features that allow a certain requirement to be met
in several different ways. That is, there are various
distinct usage practices that would satisfy the WCAG
success criterion. To support accessibility, content
developers and software implementors need to know
which to support.
- The most likely outcome is that the techniques
documents would fill the void left by the absence of,
or inconsistencies among, usage practices with
respect to the success criterion. In substance, the
techniques documents would be legislating by
specifying one or more of the alternative solutions
allowed by the format specification; and this is what
would give rise to the accusation I mentioned of
conferring a de facto authoritative status on the
techniques.
- One answer might be to adopt a technique
development process whereby proposed techniques are
implemented first, tested for efficacy, and only then
integrated into a document; the techniques documents
might then be truly statements of empirical fact
rather than disguised norms.
- Gregg
responds: What is in the techniques doc does
not change that fact [that if there are several
acceptable methods, there are several. Or, if
there is only one, there is only one]. Being in
the techniques doc may make one approach more
known. And it may give the user of the technique
more evidence that it is a good technique. And it
may be used more. (Is that what you mean by
de facto standard?) But it does not make it
normative. And it does not require the user
to use it to conform.
- Perhaps we are confusing the term "standard" in
its two meanings. One is "the way most people do
it" and the other is "the way you must do it to
conform".
- John
Responds: Perhaps "conventional" or
"typical" would be better terms to use for
that first sense of the word "standard." For
example, it has become "conventional" to use
the link text "More ..." for a link to the
continuation of a news item. By contrast, the
HTML standard (in this case a specification)
requires that the link text is enclosed
within an anchor element.
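John's contrast can be put in markup; the URL and the descriptive wording below are illustrative only:

```html
<!-- "Conventional" (the way most people do it): repeated,
     non-descriptive link text for a news continuation. -->
<a href="story.html">More ...</a>

<!-- What the HTML specification requires in either case is only
     that the link text be enclosed within an anchor element;
     descriptive text is better practice but not mandated. -->
<a href="story.html">More about the face-to-face results</a>
```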
- Jason
responds: Excellent. Perhaps define a
"conventional and supported manner" as:
- A manner prescribed in a technical
specification defining the
technologies used to implement the
content.
- A manner which has become customary
within the community of Web
content developers at large, or
among specialists in the design of
accessible content.
"Supported" would have to be defined
in terms of implementation by user agents
or other applicable tools (e.g., content
validation and testing software). Again,
it would have to be decided what the
minimum necessary level of implementation
was, bringing us back to the difficult
question of user agent support.
- Neil
responds to Jason: It's
impossible to define a normative
baseline. All you can do is state the
baseline used in compiling the
guidelines. The baseline
responsibility falls to authors and
publishers who know their audiences.
Authors will adjust baseline using a
variety of data: access logs,
published data, word of mouth, client
requirements, and financial
constraints. Question becomes, "How
do authors identify the level at
which they have set their baseline?
Moreover, how do they claim
compliance if their baseline is below
WCAG published baseline? Is it
possible to test varying baselines?"
[@@unclear what the "WCAG published
baseline" is - that which we assume
when writing the guidelines?]
- Jason
responds to Neil: The most
that should be required by WCAG
2.0 is that there either exist a
UAAG 1.0-conformant
implementation of the
technologies relied upon by the
content; or such repair
techniques be implemented as to
compensate for the shortfall of
at least one implementation from
UAAG 1.0.
- Informative advice regarding
user agent support should be
provided in techniques.
- Widespread availability of
implementations is desirable but
should not be part of any
conformance requirement at least
at level 1 and preferably not at
all.
- Jason
responds (to gregg) (golden nugget):
Suppose we decided that we wanted a means of
distinguishing "layout" tables in HTML from
genuine data tables. The HTML spec doesn't
provide for this, and in order to be
practically useful the chosen technique would
have to be adopted by authors and authoring
tools. It would also have to be recognized by
user agents and assistive technologies.
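A minimal sketch of what such a convention might look like, using only HTML 4.01 features. The summary wording is illustrative, and reading the *absence* of these features as "layout" is an assumption, not an agreed technique:

```html
<!-- A genuine data table: caption, header cells with scope, and a
     summary give user agents and AT something to announce. -->
<table summary="Success criteria grouped by conformance level">
  <caption>Success criteria by level</caption>
  <tr><th scope="col">Level</th><th scope="col">Criterion</th></tr>
  <tr><td>1</td><td>Text equivalents are provided</td></tr>
</table>

<!-- A layout table: no caption, th, or summary. Treating this
     absence as "layout" is only a heuristic - which is exactly
     the chicken-and-egg problem described above. -->
<table>
  <tr><td>navigation links</td><td>main content</td></tr>
</table>
```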
- Gregg
responds to Jason
- we can't put something in
guidelines that can't be done. Assume
that we may be able to do it with one
technology but not another.
- If it can't be done with technology
x, then papers, standards, or work
with UA should be done to make it
happen. We shouldn't put something in
the techniques doc as required if it
can't already be done. [@@ doesn't he
say that nothing in the techniques
doc is required?]
- techniques that are not standard
are advisory
- guidelines do not require that you
follow techniques
- techniques help authors understand
guidelines/SC, provide examples and
methods of how to do something
- techniques that satisfy the SC but
are not in our techniques doc can be
used
- nugget: the techniques doc can
neither make something 'required' by
including it nor make it
unacceptable by omitting it.
- Neil
responds to Gregg:
- Firstly, a "requirement" is a
"requirement", irrespective of
whether it is currently possible
to satisfy. A requirement that
cannot be satisfied yet, for
whatever reason, cannot be
eliminated, only deferred.
- At least one technique should
be offered per SC in order to
*show* that it is actually
possible to satisfy the SC and to
demonstrate *how* it might be
done. If there is no current way
to satisfy the SC, the SC must be
deferred. Possible ways to defer
Guidelines/SC:
- Remove them from the
current version and
re-instate in a later version
as and when the barriers are
removed. This would require a
separate set of documents
that shows which SCs are
deferred and why.
- Create a level 4, just for
deferred SCs, and promote
them as the barriers are
removed. This way, you can
have a techniques document
for level 4 SCs that is just
informative. As long as it is
clearly stated that this
level is included for
completeness only.
- Leave them in the
guidelines (where they are)
knowing that authors will not
be able to satisfy the
requirements and write
techniques that are not
possible to implement.
- Gregg
responds to Neil:
- What is needed, then, is an agreed upon
technique that content and software
developers could implement, each under the
expectation that the other parties would
support it. The question, then, is how this
technique would be chosen, given that there
are various ways (consistent with the HTML
spec) of glossing the semantics of table
markup.
- if WCAG techniques were to step in at this
point and recommend one usage as the norm,
then we would be trying to set a standard in
the techniques rather than recording
established facts, with the expectation that
software and content developers would follow
what was in the techniques. Of course it
might be argued that the techniques would
just be making one of these methods better
known; but the problem is that there aren't
really, in this case, two or more acceptable
methods; the technique will only work if it's
adopted by tool and user agent developers.
Having an agreed upon technique is almost
more important than what the technique is.
The issue I understood Al to be raising is
how these issues would be settled at the
technology-specific level if WCAG didn't
provide normative requirements at this level.
Such a problem is distinct from the so-called
"baseline" issue, but it does raise important
questions.
- The proposal thus transforms the problem to that of
providing sufficiently applicable and testable criteria for
determining whether a particular technique for applying a
success criterion using a given technology is "standard and
supported"; and I think the question is still open whether
this can be made workable.
5) Tests will not set a baseline
Tests will not set a baseline. Multiple tests may be provided to
correspond to multiple techniques.
Comments:
- Without agreed methods of observing the de-jure normative functional
outcomes, will not the guidelines be unenforceably vague? (Al
Gilman msg #1)
Baseline or not
The question seems to boil down to: baseline or not.
- Shadi
Abou-Zahra msg #1
- If you will (need to) provide multiple techniques anyway, why not
define a default for "authors" who cannot (for several reasons) make
assumptions about their audience?
- Because you do not define a baseline, you need to define
conformance to the Guidelines under the assumption that user agents
and assistive technologies support scripting; as well as for the
opposite assumption.
- Since you already (implicitly) define conformance to both
assumptions anyway, why not define that (for example) "scripts turned
off" is the default unless "authors" can safely assume a different
audience?
- Wendy
Chisholm
- To address these concerns, I believe there was some
discussion about the following points at the face-to-face
(although not part of the proposed resolutions and no
consensus was called for these):
- The one assumption that everyone should make is that
people with disabilities are in the audience.
- If a decision maker (a government, a customer, a
company, a manager, or an author) *can* make further
assumptions about the audience (either because they are a
government that gives tools to its citizens or a company
that sells an enterprise application that requires
specific technology), then an alternative is not needed
(for technology that might be turned off or not
supported).
- If a decision maker *can not* make further assumptions
about the audience (because the decision maker is
publishing to the whole Web or doesn't have control over
user tools), then the content is functional when
technologies are turned off or not supported *or* an
alternative must be provided.
- Jason
White:
- Loretta
Guarino Reid msg #1
- I claim that what your example is proposing is that we should
define a baseline, since the baseline is what
authors can assume about their audience, and that you are
proposing that it should not include scripting. If this is part of
the Guidelines themselves, it becomes normative. I think this means
it is impossible for a government, for instance, to define a
different baseline and still harmonize with WCAG2. And that, as
technology continues to progress, the baseline could only be shifted
by revising WCAG.
- One suggestion from the Face to Face was that WAI should provide
guidance and information to assist in
defining sensible baselines. I think this would mean a repository
of information about User Agents.
- Neil Whiteley: I would argue to the contrary. In fact I would
go as far as to say that governments and other organisations will
be looking to WCAG for a normative baseline upon which to build.
If organisations wish to develop their own extended baselines
then there is nothing to stop them in that pursuit however if all
were to build upon a common foundation, this would promote
greater harmony in the long haul.
- Neil
Whiteley
- Jason
White - scenarios to avoid
- Tim Boland - QA
Concerns re: WCAG 2.0 baseline and structure - the questions are
unclear.
$Date: 2005/03/31 01:52:13 $ Wendy Chisholm