<scribe> scribe: Joshue108
JW: Looks for first on list
We are due to talk about TPAC meetings
We had a bunch of meetings and today will go through outcomes etc
We have discussion on sync issues with APA
and it became clear that our contributions, noting Steve Noble's work, are to be continued
JS: Nigel M makes the point about the need for more research a la second screen
We are on the right track
to drive change in W3C specs
Am very pleased about that..
<mentions the BBC work they do on video descriptions in text files etc>
You could pause media and consume alternate content
JS: Is going to be working on minute clean-up
... This leads to the idea of pausing the media to enable the
slowest device to catch up
Someone using a Braille device to read captions, language translation - or texted video descriptions
JS: This is in the MAUR
We need to emphasize and may need an API
JS: Does the MAUR capture this?
If not we need to discuss.
JS: I'd want to be more emphatic about it
We have a lot in email.
<TTS speed is varied - and every video description is an extended video desc>
Maybe no technical difference.
If studying lips you may want to do this.
JW: Was anyone else present beyond Jason, Josh Janina
<jasonjgw> Josh: notes the need to define acronyms such as MAUR in cross-group meetings.
<jasonjgw> Judy concurs.
<jasonjgw> Josh: notes a GitHub link.
<jasonjgw> Josh: we should be aware of synchronization tolerances in user requirements.
JS: Three points about sync in HTML
Timed text is on board if we want to be specific
both TTML and WebVTT support millisecond time specifications
We have sufficient accuracy
JW: Granularity
JS: Yes, we need authors to be careful, it's not being done.
And we also need UAs to keep things in tolerances.
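[scribe note: as mentioned above, both TTML and WebVTT allow cue timings down to the millisecond. A minimal WebVTT cue illustrating that precision - the cue text is invented, not from any spec example:]

```
WEBVTT

00:00:01.250 --> 00:00:04.375
Captions can be timed to the millisecond.
```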
JOC: There may also be issues around exposing AT data to JavaScript
JW: That would be necessary
around synchronisation
... I'm not convinced
Can be done on the client
<scott_h> apologies everyone - need to attend to a family matter
JS: May be a flag.
Captions are used widely in bars and gyms
JW: The point of pausing media - you just need an event that signifies when the user is finished.
The UA doesn't need to know
JS: Like a pause button on remote
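[scribe note: the event-driven pause discussed above could be sketched roughly as below. All names here (MediaGate, consumerDone, etc.) are hypothetical illustrations, not from any W3C API:]

```typescript
// Hypothetical sketch: hold playback at the end of a cue until the
// slowest consumer (e.g. a Braille display user) signals it is done.
// The UA only needs a "finished" event; it doesn't need to know why.
class MediaGate {
  private pending = 0;
  paused = false;

  // A cue is shown to some number of slow consumers.
  cueShown(consumers: number): void {
    this.pending = consumers;
  }

  // The cue's media time has elapsed; pause if anyone is still reading.
  cueEnded(): void {
    if (this.pending > 0) this.paused = true;
  }

  // One consumer signals it has finished; resume when all have.
  consumerDone(): void {
    this.pending = Math.max(0, this.pending - 1);
    if (this.pending === 0) this.paused = false;
  }
}
```

Usage: wire `cueEnded` to the caption track's cue-exit timing and `consumerDone` to the "user finished" event - the pause-button-on-a-remote behaviour, automated.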
http://bbc.github.io/subtitle-guidelines/#Synchronisation https://usercontent.irccloud-cdn.com/file/oEjBx7EP/image.png
JS: Yes we are progressing
JW: Steve or Scott can you update?
SN: We have more resources that I've not looked at yet.
I do plan to revisit this week and make more progress
JS: Thanks for clarification
SN: Have found good resources etc
JW: Will put it on agenda for next week.
JS: We are still early days of researching..
not even gotten to @alt content representations
There are issues around how much something matters, and how close captions are, etc.
How long do captions need to be, to be effective?
SN: Agreed
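[scribe note: as a rough illustration of the "how long" question above, a caption's display time can be checked against a reading-rate ceiling. The 180 words-per-minute default is an assumed example figure, not a quoted guideline:]

```typescript
// Hypothetical check: does a caption stay on screen long enough for
// its word count at an assumed maximum reading rate?
// The 180 wpm default is illustrative, not a normative figure.
function meetsReadingRate(
  wordCount: number,
  displaySeconds: number,
  maxWordsPerMinute = 180,
): boolean {
  const wpm = (wordCount / displaySeconds) * 60;
  return wpm <= maxWordsPerMinute;
}
```

For example, 6 words over 3 seconds is 120 wpm and fits; 20 words over 3 seconds is 400 wpm and does not.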
JW: They are developing a range of capabilities
An interest for me is the API that would disclose the raw media to the browser and to Machine learning apps
They noted, image, speech and emotion recognition
These directions do have disability implications
They also commented on the RAUR
What else?
JS: We can work with them over time.
JW: We had a look at an open source app that transmits messages between SIP and WebRTC
JS: They have a gateway in that server
JW: There seemed to be difficulty finding good SIP clients
<mutual interest in * SIP>
JS: Discussion on emergency comms
etc
... We made our points about pinning.
And the UA capturing those streams and not the primary streams.
We will get it fed in.
JS: We need to work with them a little more.
<jasonjgw> Josh: notes the need to separate out requirements documented in the RAUR for different W3C groups.
<jasonjgw> Josh notes that RAUR is not a WebRTC-specific document.
JS: We have second screen for example, and we have needs that can be helped with other APIs
... Really it is to help the WebRTC group and yes, it applies to the MAUR or other specs
but the RAUR is feeding a need and good as is.
We see the relations between areas - and we need to work out responsibilities etc
JS: One of the observations is that protocols and APIs need to be written to support things directly
Application guidance vs specification guidance
So we need to look at these differences
In the Edu context they may develop WebRTC app for various things
There are various applications that need to support a11y
Should we be looking at this?
JS: Don't know - may be RAUR, may be something else.
Could be an EO best practices thing for example
We need to make sure the RAUR is supporting what we need
Needs to be in there.
JW: We should have that reviewed as we go to see if the pieces are supported.
JS: That is true for the MAUR, XR stuff etc
JOC: So are we reaching completion with the RAUR?
JS: I think so.
JW: Ok - if we are fine.
JS: Sees no reason to change
JW: We could document that the *AUR suite needs to be tested against, or satisfy, other specs such as WCAG
So the implementation of a11y requirements becomes a matter for Silver etc.
The application authors need to have the underlying plumbing etc. fit for purpose.
JS: That's why we had Silver at these meetings
JW: This was interesting - as they have work taking place a la a11y use cases but also vocabs that describe interactions with a range of devices.
JS: I'm looking forward to bringing this up today on APA
They look like they are reinventing some previous work that we did in an older spec.
We have been here before, so lets look at V1.
INCITS V2
Led by Gregg V
JW: The Universal Remote Console?
JS: That's one thing..
So devices could interact and support each other
JW: All built on XML, Gottfried also published on this..
JS: It was his doctorate.
JW: We will take this up at an APA meeting and discuss here.
JS: We have a basis for further conversation, and they are highly receptive.
... I'll find a pointer.
Only one of the specs though..
JS: There is one with 24 72 etc
https://github.com/w3c/wot-usecases/issues/64
JOC: I think we could substantially contribute to the use cases doc for WoT - it may already be mostly written
JS: Possibly, want to talk to Gottfried.
... We were most prepared - it was a quick meeting.
... We think we are done.
JW: The COGA group will look at XAUR and contribute.
https://github.com/immersive-web/dom-overlays
JOC: I'm going to review
JS: Mike Crabb is also looking
... Anything from here to note?
... Regarding CSS and Privacy
<jasonjgw> Josh: notes privacy issues raised in the CSS meeting.
JS: Am not sure who owned that module
JW: That's right
JS: It keeps turning up..
the train has left the station here.
JOC: Where is the locus of control here?
If done in this way it has a brittle architecture
JS: Big data.
... This isn't in our remit
... We can input from a11y perspective only
... Nothing from me
JW: Lets talk about this again early next week to get input from Scott.
JOC: Lets do that.
JS: We can improve how we present all this for next year.
... We will pick this up in two weeks
No meeting next week
Present: jasonjgw, SteveNoble, scott_h, nicolocarp, Joshue108
Scribe: Joshue108
Date: 21 Oct 2020