Meeting minutes
Media synchronization use cases and user needs.
jasonjgw: Updated wiki page in view of our discussion.
Transcript wiki at: https://
jasonjgw: Some of these capabilities seem to exist already
SteveNoble: YouTube uses its own proprietary technology to let users review the video transcript, search for a particular word, and then navigate to that part of the video
<jasonjgw> Steve investigated, noting that the support for relevant capabilities is proprietary in YouTube.
<jasonjgw> Steve: it supports searching and navigation to the point where the search term is found in the video. As far as Steve has determined, this capability isn't directly supported in existing Web specifications (WebVTT, TTML).
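[Scribe note: although the specs don't define this directly, the search-and-seek interaction Steve describes can be approximated client-side over timed-text cues. A minimal sketch, assuming cues with start/end times in seconds and plain text; the `Cue` shape and `findCueStart` name are illustrative, not from WebVTT or TTML.]

```typescript
// Illustrative cue shape: start/end in seconds, plain text payload.
interface Cue {
  start: number;
  end: number;
  text: string;
}

// Return the start time of the first cue containing the search term
// (case-insensitive), or null if the term does not appear.
function findCueStart(cues: Cue[], term: string): number | null {
  const needle = term.toLowerCase();
  for (const cue of cues) {
    if (cue.text.toLowerCase().includes(needle)) return cue.start;
  }
  return null;
}
```

In a browser, navigation would then be a seek on the media element, e.g. setting `video.currentTime` to the returned start time.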
Raja: Google has some capabilities as well using VTT
Raja: Timed transcripts are easier to add with ASR
jasonjgw: Not sure whether the TTML and WebVTT specifications are well supported, or which would be best to concentrate on for new capabilities
janina: WebVTT was a Google tech originally
janina: Both are W3C specs so we can read them
janina: Will need to differentiate between captions and transcripts
Raja: The main thing is if they are inside the video or outside the video
janina: Some video containers may be able to include these assets, may need to investigate
Raja: There are differences in the way these assets are stored...whether it is streamed or a recording
janina: We need to be clear that we need something more than just the verbal stream
Links from Raja: https://
https://
https://
janina: Finding out what the Matroska spec is and what it supports
janina: Also MP4
Raja: will look into this
Raja: All major browsers support WebVTT for showing captions or timed transcripts. A major open-source web-based video player supports showing both captions and timed transcripts
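[Scribe note: for reference, WebVTT cues are simple text blocks separated by blank lines, each with a `start --> end` timing line followed by the cue text. A minimal, non-conformant parser sketch (timestamps only; no cue settings, identifiers, or styling) illustrating how a timed transcript could be read from a .vtt source:]

```typescript
interface VttCue {
  start: number; // seconds
  end: number;
  text: string;
}

// Convert "HH:MM:SS.mmm" or "MM:SS.mmm" to seconds.
function parseTimestamp(ts: string): number {
  return ts.split(":").map(Number.parseFloat).reduce((acc, p) => acc * 60 + p, 0);
}

// Minimal WebVTT cue extraction: split on blank lines, skip blocks
// without a "-->" timing line (the WEBVTT header, NOTE blocks).
function parseVtt(src: string): VttCue[] {
  const cues: VttCue[] = [];
  for (const block of src.trim().split(/\n\s*\n/)) {
    const lines = block.split("\n");
    const timing = lines.findIndex((l) => l.includes("-->"));
    if (timing < 0) continue;
    const [start, end] = lines[timing].split("-->").map((s) => parseTimestamp(s.trim()));
    cues.push({ start, end, text: lines.slice(timing + 1).join("\n") });
  }
  return cues;
}
```

Browsers expose the same data through `<track>` elements and the `TextTrack` API without any manual parsing; this sketch only shows what the format carries.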
https://
janina: We discuss some of this in the MAUR
jasonjgw: Will look at the MAUR and we can cite that
jasonjgw: We can bring back next week
Collaboration Tool Accessibility User Requirements.
jasonjgw: Updates?
<janina> https://
<janina> https://
janina: So, there was some confusion about which source document to examine; COGA was looking at the wiki rather than the FPWD
scott_h: Have found Microsoft collaboration documents simpler than Google Docs
jasonjgw: Have noticed that Microsoft has been making major changes at high frequency, so perhaps they will be more inclined to make accessibility updates in this regard
jasonjgw: They have integrated some automated assistance for document editing, so that is something else to look at in the AI discussion
janina: We seem to be on the right track now on getting comments
janina: Expect comments in perhaps 3 weeks
scott_h: Have been looking more on this and have some contributions to make
Miscellaneous updates and topics.
XR and accessibility
jasonjgw: Need to look at the Benetech grant, but still waiting for an email response
Metaverse
janina: Discussed in recent leadership meeting. Desire to pull together related specifications and best practices on accessibility in the metaverse
janina: Desire to identify anything that is at least FPWD
janina: Need to get this done in 2 weeks
jasonjgw: We have some of this in some of our documents, like SAUR and Natural Language
Raja: Both time and space matter. Speech and video are typically tracked over time, but with XR one also has to track space
janina: Good to consider updates to these documents with XR in mind
jasonjgw: Can bring back Metaverse standards discussion next week
jasonjgw: Future of Interface workshop recordings are available. Would be good to investigate
<jasonjgw> https://