Copyright © 2002 W3C® (MIT, INRIA, Keio), All Rights Reserved. W3C liability, trademark, document use and software licensing rules apply.
This document is part of the Quality Assurance (QA) Activity. It presents examples of, and describes techniques for, the operational aspects of quality practices within the W3C's Working Groups. It complements "QA Framework: Operational Guidelines" [QAF-OPS], by specifying or illustrating how to meet the operational and process-related checkpoints of that document.
This version is an Editors-only draft. SOTD for FPWD follows...
This section describes the status of this document at the time of its publication. Other documents may supersede this document. The latest status of this document series is maintained at the W3C.
This document is a W3C Working Draft (WD), made available by the W3C Quality Assurance (QA) Activity for discussion by W3C members and other interested parties. For more information about the QA Activity, please see the QA Activity statement.
This version is the first public Working Draft, and supersedes all previous WG-only drafts. It is expected that updated WD versions of this document will be produced regularly, along with other members of the Framework documents family. Future progression of this document beyond Working Draft is possible, but has not yet been determined.
In this version, several example case studies are presented in some detail -- how the QA projects of three existing Working Groups relate to the checkpoints of the operational guidelines [QAF-OPS]. It is expected that future versions will add at least one example to the current three (SVG, DOM, and XML). This version contains relatively little development of "techniques" -- what Working Groups should do to satisfy the operational and process-related checkpoints, and what constitutes conformance to the checkpoints. This is expected to be a significant addition in future versions.
Please send comments to www-qa@w3.org, the publicly archived list of the QA Interest Group [QAIG]. Please note that any mail sent to this list will be publicly archived and available; do not send information you would not want to see distributed, such as private data.
Publication of this document does not imply endorsement by the W3C, its membership or its staff. This is a draft document and may be updated, replaced, or obsoleted by other documents at any time. It is inappropriate to use W3C Working Drafts as reference material or to cite them as other than "work in progress".
A list of current W3C Recommendations and other technical documents can be found at http://www.w3.org/TR.
1. Introduction
2. Anatomy of this document
3. Conformance test suite projects
3.1 DOM
3.2 SVG
3.3 XML
3.4 XSLT
4. Operational guidelines in action
G 1. Integrate Quality Assurance into Working Group activities.
G 2. Define resources for Working Group QA activities.
G 3. Synchronize QA activities with the milestones for WG deliverables.
G 4. Define the QA process.
G 5. QA Process: Plan test materials development.
G 6. QA Process: Plan test materials publication.
G 7. Plan the transfer of test materials to W3C if needed.
G 8. Plan for test materials maintenance.
5. General remarks
6. Conformance
7. Acknowledgments
8. References
This document contains examples and techniques of QA-related activities within the W3C as well as guidance as to how the quality processes of the Working Groups (WGs) should be carried out. It complements "QA Framework: Operational Guidelines" [QAF-OPS], by specifying or illustrating how to meet the operational and process-related checkpoints of that document.
The operational guidelines document should be considered a prerequisite to this one. In addition to containing all of the checkpoints to which the contents of this document pertain, it explains principles underlying the conduct of WG quality processes.
Four test suite (TS) efforts related to W3C Recommendations are used as examples. Some of them were developed in-house (within the responsible WG), some externally (outside of the W3C), and some are hybrids (external plus in-house effort). All of them have succeeded in producing significant and valuable TS deliverables.
In all four cases, the associated WGs already existed and were well advanced on the relevant Recommendations before the operational guidelines [QAF-OPS] were first published. Because these projects preceded the guidelines, it makes no sense to score them against the checkpoints -- it is unreasonable to expect them to meet all of the checkpoints. Instead, various aspects of the different projects are used to illustrate the meaning and intent of the checkpoints. In a sense, the checkpoints are the union of the best practices and techniques from all of the projects (these and others).
This points to a timing-based reality of conformance to the operational guidelines: the conformance criteria ("techniques", to be developed and added in the future) will be different for new Working Groups than for existing Working Groups.
This document is heavily weighted towards case studies, examples and illustrations. It has relatively little specification of techniques for meeting checkpoints, or criteria for judging conformance to the checkpoints.
The structure of this document parallels the structure of the operational guidelines [QAF-OPS] document, particularly its Chapter 3, "Guidelines". It should be realized that all of the given examples of W3C Working Groups' QA-related activities were started before the applicable parts of the W3C QA Activity were in place. The useful milestone for the latter is the first publication of the Operational Guidelines, in February 2002.
Relevant common features in these example frameworks will be pointed out and compared against the statements of the checkpoints of the operational guidelines. A short description of each example QA-related framework will be presented. The subsequent parts of the document will elaborate on the checkpoints -- including links to relevant process-related documents and resources -- and how they have been realized in the various efforts.
This section presents a summary description of a small set of case studies of QA-related activities within the W3C, as well as some external to the W3C.
Each subsection covers the following topics:
The DOM TS was jointly launched by the W3C and NIST and is produced in public. It is coordinated by the W3C DOM WG, and is produced using a W3C-hosted mailing list for communication as well as W3C-supplied storage for files, documents, and so forth. The TS was to be developed in public, and the WG would endorse two bindings, Java and ECMAScript, the same as the official DOM bindings. Other bindings were welcome but would not form part of the official TS.
The DOM TS was launched by the W3C DOM WG and NIST in an effort to invite the public to participate in producing quality test material for implementations of the DOM specifications. Staffing requirements were limited, so an expert was invited to the DOM WG to produce and edit the DOM TS Process Document. In its initial phase the DOM TS was very similar to a DOM Test Suite previously launched by NIST, which was used as a primer for the forthcoming work. [@@include links to relevant emails]
Since the DOM charter did not include sections on producing test materials, this was included in the DOM TS process, which outlined the DOM TS deliverables in detail.
The DOM TS was produced after the specifications for the first two levels (DOM Level 1 & 2) and is being produced in parallel with the specification of DOM Level 3. DOM Level 1 was released in late 1998, so it was not possible to synchronize the specification of the first level of the DOM with the Test Suite.
Starting with the current level of the DOM, the TS is produced in parallel with the specification. Tests are being produced to allow implementors to check their implementations of DOM Level 3 while contributing to relevant parts of the specification. Also, members of the DOM WG have committed to produce tests for those modules of the DOM that are particularly relevant to their implementations.
There is a separate DOM TS Process Document, and also a Test Web page, maintained by the DOM WG representative to the DOM TS, where more information and pointers to primers, FAQs, and tutorials can be found.
The DOM TS uses W3C machinery to store test files, with CVS for easy revision tracking, and an issue tracking system at SourceForge to simplify issue handling and maintenance.
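The official bindings of the DOM TS are Java and ECMAScript, generated from common test description files. Purely as an illustration (this is not the actual DOM TS harness; the class name and the check shown are invented for the example), a stand-alone DOM check in the Java binding might look something like the following sketch, which uses only the standard JAXP and org.w3c.dom APIs:

  import java.io.ByteArrayInputStream;
  import javax.xml.parsers.DocumentBuilder;
  import javax.xml.parsers.DocumentBuilderFactory;
  import org.w3c.dom.Document;
  import org.w3c.dom.Element;

  // Hypothetical stand-alone check in the spirit of a DOM TS test case:
  // verify that Document.createElement() produces an element whose tagName
  // matches the requested name (DOM Level 1 Core).
  public class CreateElementTagNameTest {
      public static void main(String[] args) throws Exception {
          DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
          DocumentBuilder builder = factory.newDocumentBuilder();
          Document doc = builder.parse(
              new ByteArrayInputStream("<root/>".getBytes("UTF-8")));

          Element elem = doc.createElement("address");
          if (!"address".equals(elem.getTagName())) {
              throw new AssertionError(
                  "expected tagName 'address', got " + elem.getTagName());
          }
          System.out.println("PASS: createElementTagName");
      }
  }

In the actual DOM TS, such checks are written once in a test description file and then transformed into each endorsed binding, rather than hand-coded per language.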
As used in this case study, "SVG" refers collectively to two consecutively chartered Working Groups. SVG1 refers to the originally chartered group (first meeting, Sept 1998) that produced SVG 1.0 (Recommendation, Sept 2001). SVG2 is the successor to SVG1, (re-)chartered in 2001, and charged with developing SVG 1.1 (modularization and profiles), and SVG 2.0. SVG 1.1 is presently at the stage of CR.
SVG 1.0 was released with a comprehensive "Basic Effectivity" (BE) test suite, which was developed entirely within the working group. The test suite (TS) comprises 127 test cases that span the functionality of SVG, exercising each major functional feature of SVG 1.0 at least lightly, but not probing any feature in detail. SVG2 is adapting and subsetting the tests of the SVG1 TS for the smaller and less capable mobile devices that are the target constituency of the Tiny and Basic profiles.
The SVG1 test suite was developed entirely by the working group. A pilot study sponsored by one of the members became the basis for design. A "TS Editor" (moderator) managed the project from inception to completion. All WG members are expected to participate in the writing and review of test cases. The SVG2 approach is identical -- a WG member is the TS moderator, and all WG members contribute.
Both SVG1 & SVG2 charters specify a "comprehensive test suite" as a deliverable, and coordinate the dates of first TS delivery (first publication) and the specification CR milestone. There is no further detailed breakdown of milestones or deliverables, as this was worked out (for SVG1) as the specification development progressed.
The test suites were (are) produced during the specification-writing phase. The target is substantial completion at the CR milestone, so that the TS can be used to facilitate the "interoperable implementations" requirement of the W3C Process at CR.
SVG has no QA process document per se, but much of the content that these operational checkpoints anticipate and require may be found in the extensive reference document, "SVG Conformance Test Suite -- Test Builder's Manual". Unless otherwise specified in that document, the normal WG processes of the SVG Working Group are used to conduct the TS work.
The SVG WG keeps the SVG TS on a CVS server of one of the WG members. Revision tracking is therefore handled by CVS. Maintenance procedures and issue tracking for SVG TS are not separately defined, but rather SVG TS uses the normal SVG WG processes for these.
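The BE test cases are themselves SVG documents that an implementation renders; results are typically judged against reference renderings. As a rough sketch only (Apache Batik is used here merely as an example SVG renderer and is not part of the SVG TS; file locations are hypothetical), a harness might rasterize a test case to PNG so that it can be compared, visually or pixel-wise, with its reference image:

  import java.io.FileOutputStream;
  import java.io.OutputStream;
  import org.apache.batik.transcoder.TranscoderInput;
  import org.apache.batik.transcoder.TranscoderOutput;
  import org.apache.batik.transcoder.image.PNGTranscoder;

  // Hypothetical helper: rasterize one SVG BE test case to a PNG.
  // Apache Batik is used only as an example SVG renderer here.
  public class RenderTestCase {
      public static void main(String[] args) throws Exception {
          String svgUri = args[0];   // URI of a BE test case, e.g. file:...
          String pngPath = args[1];  // where to write the rendering
          PNGTranscoder transcoder = new PNGTranscoder();
          TranscoderInput input = new TranscoderInput(svgUri);
          OutputStream out = new FileOutputStream(pngPath);
          transcoder.transcode(input, new TranscoderOutput(out));
          out.flush();
          out.close();
          System.out.println("Rendered " + svgUri + " to " + pngPath);
      }
  }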
The XML Conformance Test Suite, v1.0, Second Edition contains over 2000 test files and an associated test report. The test report contains background information on conformance testing for XML as well as test descriptions for each of the test files included in the TS.
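Each test file is exercised by feeding it to the XML processor under test and comparing the processor's behavior with the expectation recorded for that file in the test report. As an illustrative sketch only (not part of the TS itself; the test category and invocation are assumptions of this example), a driver for a single not-well-formed test case might look like the following, using the standard Java SAX API:

  import java.io.File;
  import javax.xml.parsers.SAXParser;
  import javax.xml.parsers.SAXParserFactory;
  import org.xml.sax.SAXParseException;
  import org.xml.sax.helpers.DefaultHandler;

  // Hypothetical driver: feed one not-well-formed test file to a SAX parser
  // and report PASS if the parser rejects it, FAIL if it accepts it.
  public class NotWellFormedCheck {
      public static void main(String[] args) throws Exception {
          File testFile = new File(args[0]);  // a "not-wf" case from the TS
          SAXParserFactory factory = SAXParserFactory.newInstance();
          factory.setNamespaceAware(true);
          SAXParser parser = factory.newSAXParser();
          try {
              parser.parse(testFile, new DefaultHandler());
              System.out.println("FAIL: parser accepted a not-well-formed document");
          } catch (SAXParseException expected) {
              System.out.println("PASS: " + expected.getMessage());
          }
      }
  }

Analogous drivers would check that valid documents are accepted and, for validating processors, that invalid documents are reported.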
The OASIS/NIST XML Test Suite forms the basis for the W3C XML TS. It was transferred to W3C, where it is being augmented with new functionality, corrected to align with the Recommendation and its errata, and maintained. Prior to accepting the OASIS/NIST Test Suite, the XML Core WG assessed the level of effort necessary for TS development and maintenance and ensured that staff resources were interested and available to work on the TS. Additionally, the WG enlisted the assistance of NIST to lead the effort. The TS is being developed as a deliverable of the WG, following the WG's Process. There is a W3C-hosted XML TS mailing list, an Issues list, and W3C-supplied storage (CVS) for the test-suite-related files.
A commitment from NIST to take editor-lead responsibility for the TS was obtained prior to the commencement of the work. Consequently, NIST joined the WG. Members of the XML Core WG contribute to the development and quality assessment of the TS. When the XML TS is released by the WG, the general public will be invited to review the tests as well as participate in contributing tests.
The W3C XML Core WG charter does not include sections on producing test materials. A new charter provides for the possibility of working on the OASIS/NIST XML Test Suite. The WG does not have a separate QA Process, but follows the WG Process for the development, assessment and maintenance of the XML TS. This is not explicitly documented in a separate QA process document, but the individual decisions are on record in WG mail archives and minutes as part of the normal WG process.
The work on the OASIS/NIST XML Test Suite was conducted after the XML 1.0 Recommendation was released. A revised edition of the OASIS/NIST Test Suite (1.0, Second Edition) was released to complement the W3C XML 1.0 Second Edition Recommendation (October 6, 2000). Within the XML Core WG, the XML TS is being updated to reflect the current Recommendation and all applicable Errata.
There is no separate QA Process document. The XML Core WG uses its established WG Process to handle test development and assessment activities, communication, and test-materials-related functions.
The XML WG uses the W3C CVS repository to maintain the XML TS. XML TS issues are handled through an internal assessment process that consists of a running log of test issues, the XML TS mailing lists, XML Core WG discussions, and the XML Test Suite Issues document that tracks the running log.
[@@ to be done. Probably must wait till next version after FPWD.]
This section looks at the guidelines and checkpoints in the operational guidelines [QAF-OPS], and how their requirements are reflected in the various existing frameworks. Where appropriate, pointers to relevant documents and materials are provided.
The structure of this section exactly parallels the structure of Chapter 3, "Guidelines", of the operational guidelines [QAF-OPS] document. Each checkpoint links back to the corresponding checkpoint of "QA Framework: Operational Guidelines" [QAF-OPS].
Under each checkpoint, there is a discussion of how each of the example QA or TS projects relates to the checkpoint. Finally, there will be a summary enumeration of ways in which each checkpoint may be satisfied [@@this synthesis will be done in a future version].
Guideline 1. Integrate Quality Assurance into Working Group activities.
DOM discussion
In the DOM TS, QA activities were integrated into Working Group activities by inviting a field expert to coordinate between the WG and the TS Group. Coordination was necessary because the DOM WG QA activity was launched together with an external party and, furthermore, the test materials were to be produced externally.
SVG discussion
QA was integrated into the SVG WG activities by charter commitment to produce test suites, and subsequent involvement of most WG members in the writing, reviewing, and maintenance of the test materials. The charters committed, in the cases of both SVG1 and SVG2, that a "basic test suite" (i.e., Basic Effectivity) would exist at or before Candidate Recommendation. The details of deliverables, milestones, and publication were generally worked out by the WG as the specifications developed.
XML discussion
The QA activities are integrated into WG activities by using the WG Process. Although the activities are not reflected in the WG Charter, deliverables are identified, expected release dates determined, staff allocated, and there is a clear commitment to developing and maintaining the TS.
Checkpoints
Checkpoint 1.1. In the Charter, specify the level of commitment to QA. Plan to have at least a commitment to "QA level three". [Priority 1]
Guidelines document description of checkpoint.
Checkpoint 1.2. In the Charter, plan at least for the "QA level five". [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 1.3. In the Charter, plan for the "QA level seven". [Priority 3]
Guidelines document description of checkpoint.
Checkpoint 1.4. In the Charter, enumerate QA deliverables and expected milestones. [Priority 1]
Guidelines document description of checkpoint.
Checkpoint 1.5. In the Charter, associate QA criteria with WG milestones. [Priority 2]
Guidelines document description of checkpoint.
Guideline 2. Define resources for Working Group QA activities.
DOM discussion
The DOM WG appointed an invited expert to monitor and coordinate the work of the DOM TS. Furthermore, the members of the DOM TS have looked into producing tests for those parts of the specification which are of particular interest to their implementations.
SVG discussion
The SVG1 and SVG2 charters explicitly charge the WG with producing test materials, but do not further address specific staffing. In fact, the expectation has evolved that all WG members should contribute to the building of the TS. In both cases, a "TS Editor" has been assigned to lead and coordinate the TS production.
XML discussion
The WG appointed NIST to lead the TS effort and coordinate the work of the XML TS. Furthermore, members of the WG have reviewed and assessed the current set of tests, corrected tests, and contributed tests.
Checkpoints
Checkpoint 2.1. In the Charter, address where and how conformance test materials will be produced. [Priority 1]
Guidelines document description of checkpoint.
Checkpoint 2.2. In the Charter, address QA staffing commitments. [Priority 1]
Guidelines document description of checkpoint.
Checkpoint 2.3. In the Call For Participation, request allocation of QA resources to the Working Group. [Priority 1]
Guidelines document description of checkpoint.
Guideline 3. Synchronize QA activities with the milestones for WG deliverables.
DOM discussion
The milestones were given in the DOM TS Process document and were monitored by the DOM WG in order to ensure the satisfactory continuation of the DOM TS Group's work. In those cases where the TS Group needed more time to achieve its goals, the DOM WG representative reported to the WG in order to appeal for an extension.
This is applicable insofar as the TS releases are (for levels currently being specified) synchronized with the relevant modules of the DOM specifications.
SVG discussion
The SVG1 and SVG2 charters establish the basic synchronization, which is that the first publications of basic test suites (TS) should be at or before the beginning of the CR stage. Beyond that, there are no explicit (i.e., written down) synchronizations; in practice, synchronization happens as part of the WG process and the TS building. TS publications are synchronized with selected late-stage document versions (CR, PR, Rec). There is no explicit scheme for labeling/linking test cases according to errata, although this could be reconstructed from the CVS repository for the TS.
XML discussion
The TS work is being conducted in close coordination with the WG's other deliverables. The TS is being updated and augmented to reflect the current work of the WG, including resolved issues related to the Recommendation and published Errata. Tests for new functional areas are also being explored as the deliverables for those Recommendations progress (i.e., XInclude and Namespaces).
Checkpoints
Checkpoint 3.1. Synchronize the publication of QA deliverables and the specification's drafts. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 3.2. Support specification versioning/errata in the QA deliverables. [Priority 3]
Guidelines document description of checkpoint.
Guideline 4. Define the QA process.
DOM discussion
DOM TS Process document
SVG discussion
SVG supports the process requirements of a QA moderator (manager of TS development) and a "task force" (which is the whole WG, by explicit choice). TS-related communication is handled internally on the whole-WG list, and externally via a public TS comment list. SVG does not have a Process document per se that spells out these details, but it does have a TS manual that defines the TS framework in some detail.
XML discussion
The WG Process is used.
Checkpoints
Checkpoint 4.1. Appoint a QA moderator. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 4.2. Appoint a QA task force. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 4.3. Produce the QA Process document. [Priority 1]
Guidelines document description of checkpoint.
Checkpoint 4.4. In the QA Process document, define means for QA-related communication. [Priority 2]
@@Why is this checkpoint limited to "email"? Where does the TS home page fit in?@@
Guidelines document description of checkpoint.
Checkpoint 4.5. In the QA Process document, specify the framework for test materials development. [Priority 2]
@@Even in 0311 draft, I don't understand the scope and usage of "framework".@@
Guidelines document description of checkpoint.
Checkpoint 4.6. Specify a policy for branding materials. [Priority 3]
Guidelines document description of checkpoint.
Guideline 5. QA Process: Plan test materials development.
DOM discussion
The DOM TS initially drew heavily on the experience of an external party (NIST). Where the subsequent work diverged, a new version was branched off and became the official W3C DOM TS. The test material planning was done by the DOM WG representative to the DOM TS, in coordination with NIST and the public, after the DOM TS had been officially launched.
Several design issues were discussed by the DOM TS Group itself, then brought to the DOM WG for acceptance. Such issues include the automatic generation of the schema language for the DOM TS, the use of the same test description file to generate the official bindings (and the unofficial ones, using a different transform), as well as the use of particular build tools to build working and official releases of the DOM TS.
Most discussion was done on the DOM Group mailing list and coordinated with the DOM WG. More information can be found in the DOM TS page, which serves as a complete reference for DOM TS related information.
SVG discussion
The planning for SVG's test materials development started with a study and prototype design sponsored by one of the SVG WG members. This work was brought into the WG as the starting point of the TS development, and was further developed, refined, and applied by the moderator ("TS Editor"). Both user documentation (for TS releases) and contributor documentation were developed. The latter described the TS framework in some detail, and gave both technical criteria for the contributed test cases and review/acceptance criteria. The actual processes and procedures for contribution and review were not explicitly defined, as all materials came from WG members (and therefore followed WG process).
XML discussion
The XML TS drew on the experiences and expertise of NIST, the original developer of the test suite framework. The original work was revised to include test requirements with a mapping to the test cases and to the specification. The planning of improvements and augmentation to the original work was discussed and decided by the WG. Techniques and methods were borrowed from other test suite development efforts, such as DOM and XSL-FO.
Checkpoints
Checkpoint 5.1. In the test framework, ensure test materials are documented and usable for the intended purpose. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 5.2. In the QA Process document, define a contribution process. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 5.3. In the QA Process document, define the licenses applicable to submitted test materials. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 5.4. In QA Process document, define review procedures for submitted test materials. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 5.5. Test materials review process includes, but is not limited to, checking for accuracy, scope and clarity of the tests. [Priority 1]
Guidelines document description of checkpoint.
Guideline 6. QA Process: Plan test materials publication.
DOM discussion
Plans for test material publication are discussed internally in the DOM TS group, and then decided by the DOM TS Group in coordination with the DOM WG.
SVG discussion
The SVG TS project has explicitly discussed and planned the key aspects of TS publication. A CVS repository is used for the ongoing development of the TS, and at agreed milestones a snapshot is taken and published under the W3C Document License. The TS home page links to all of the materials and documentation, and also points to an implementation matrix that publishes the TS results of a half-dozen or so implementations.
XML discussion
Publication of the TS is being discussed by the WG. The TS will reside on the W3C web space and will be freely and publicly available. In general, the publication details of the XML TS are relatively invisible outside of the WG, because there has not been any publication yet.
Checkpoints
Checkpoint 6.1. Ensure a secure and reliable repository location for future test materials. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 6.2. In the QA Process document, define the licenses applicable to published test materials. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 6.3. In the QA Process document, describe how the test materials will be published and point to the corresponding webpage. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 6.4. Provide a disclaimer regarding the use of the test materials for compliance verification. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 6.5. In the QA Process document, describe how test results for the products can be published. [Priority 3]
Guidelines document description of checkpoint.
Guideline 7. Plan the transfer of test materials to W3C if needed.
DOM discussion
This issue was addressed early in the specification phase of the DOM TS work, as a choice had to be made between integrating an existing TS and using it merely as a primer for its successor. The latter was preferred.
This is addressed in the DOM TS Process document.
SVG discussion
This guideline is not applicable to the SVG TS -- all TS development so far has occurred within the SVG WG.
XML discussion
The issue was addressed prior to the transfer of the TS to the XML WG, and prior to the start of TS development and maintenance there. Since that phase is finished, the guideline is no longer applicable. Included in the discussions were the quality of the OASIS/NIST XML Test Suite, the amount of work needed to correct and improve it, as well as IPR questions.
Checkpoints
Checkpoint 7.1. If transfer of the test materials to W3C is planned, perform an assessment of their quality. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 7.2. If transfer of the test materials to W3C is planned, identify sufficient staff resources to meet the requirements. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 7.3. If transfer of the test materials to W3C is planned, resolve IPR questions and reach agreement with the external party that produced test materials. [Priority 1]
Guidelines document description of checkpoint.
Guideline 8. Plan for test materials maintenance.
DOM discussion
No clear plan for test material maintenance has been made, other than indicating in the process document that these issues will be addressed by the DOM TS Group.
Not applicable
SVG discussion
[@@ to be done]
XML discussion
No clear plan for test material maintenance has been made yet. The WG has acknowledged the need and plans to address it in the near future.
Checkpoints
Checkpoint 8.1. Maintain contribution and review procedures throughout test materials' and standard's entire life cycles. [Priority 3]
Guidelines document description of checkpoint.
Checkpoint 8.2. In the Working Group's QA process document, specify a procedure for updates of the test materials to track new errata/specification versions. [Priority 2]
Guidelines document description of checkpoint.
Checkpoint 8.3. In the Working Group's QA process document, identify the communication channel and procedure for appeals of test validity. [Priority 2]
Guidelines document description of checkpoint.
[@@Issue. Should we have a section like this? "Assessment" or "Lessons learned" or "..."? It would draw general lessons from the three examples, assess the effectiveness and even *goodness* of the approaches, compare and contrast, critique, etc. If "yes" ... can we postpone it to next publication, and just go w/ examples for FPWD? (Don't think there's time to develop it properly, and it could be sensitive, particularly if it contains "critique".)]
[@@As just proposed, the third bullet below would be in the scope of the section, but the first two probably wouldn't.]
Experiences drawn from previous efforts, obvious pitfalls, good examples, open issues, and so forth.
DOM discussion
There are a number of issues with the DOM TS:
SVG discussion
[@@to be done@@]
XML discussion
[@@to be done@@]
There is no concept of conformance to this document. Rather, conformance is measured relative to the checkpoints of the companion document, "QA Framework: Operational Guidelines" [QAF-OPS]. This document relates real-world examples to the checkpoints' requirements, and also presents techniques by which the checkpoints may be satisfied. In that sense, it defines conformance criteria for the conformance requirements (checkpoints) of the operational guidelines document.
The following QA Working Group and Interest Group participants have contributed significantly to the content of this document: