15:00:57 RRSAgent has joined #tt
15:01:01 logging to https://www.w3.org/2024/06/06-tt-irc
15:01:02 RRSAgent, make logs Public
15:01:03 Meeting: Timed Text Working Group Teleconference
15:01:20 Agenda: https://github.com/w3c/ttwg/issues/283
15:01:26 Previous meeting: https://www.w3.org/2024/05/23-tt-minutes.html
15:01:43 Regrets: Gary
15:01:46 Chair: Nigel
15:01:50 scribe+ nigel
15:01:57 Present+ Nigel
15:02:02 cpn has joined #tt
15:02:08 scribe+ cpn
15:02:32 Present+ Chris_Needham, Matt, Pierre, Mike, Ewan
15:03:00 Present+ Cyril
15:03:05 Topic: This meeting
15:03:59 Present+ Andreas
15:04:22 Nigel: Much like last time!
15:04:34 ... DAPT: some stalled work, can we un-stall it?
15:04:41 Present+ Atsushi
15:05:36 ... There's a TTML issue and, from last time, we need to start a CfC to publish a CR draft of TTML2
15:06:16 ... The TTML2 PR needs review
15:06:30 ... Draft TPAC schedule has been published
15:06:34 ... AOB?
15:06:52 (nothing)
15:07:10 Topic: DAPT
15:07:28 atai has joined #tt
15:07:55 Subtopic: Add section about mapping from TTML to the DAPT data model w3c/dapt#216
15:08:05 github: https://github.com/w3c/dapt/pull/216
15:08:26 Nigel: This has been open for ages, and stalled, so we need to decide what to do
15:08:44 ... I think we've done everything we said we'd do, but it needs review to confirm that
15:09:10 ... I added another small change, on the table formatting, because of changes to ReSpec
15:09:25 ... So pragmatically, I applied the new table styling in this PR, as well as support for dark mode
15:10:28 Nigel: Looking at the wording for pruning foreign vocabulary and constraints on ttp:contentProfiles values, I didn't think that was testable so didn't want an extension feature
15:10:36 ... Happy to hear other opinions
15:10:46 ... Cyril, could you review, or anyone else?
15:12:22 ... I think this is the last significant thing to do before we can go to CR. This addresses #110. With #44 we can close with no change, and lastly, #75, per-script type restrictions. We may have enough
15:12:59 Cyril: We discussed last time, should we merge "represents" with the script type?
15:13:26 Nigel: That's #227, not marked as must-have. Not sure we have an answer yet
15:13:44 Cyril: If we adopt it, it would be a non-backwards-compatible change, so good to resolve before CR
15:13:55 Nigel: Have marked it as CR must-have
15:14:39 Nigel: Please add comments to the issue, to discuss next time
15:14:46 Cyril: Sure
15:14:54 MattS has joined #tt
15:15:23 SUMMARY: Needs review
15:15:25 Subtopic: Required metadata field for earliest SMPTE time code to allow conversion between DAPT and ESEF w3c/dapt#232
15:15:32 github: https://github.com/w3c/dapt/issues/232
15:15:54 Nigel: We discussed this last time, I had an action to create a PR, but I've not had time yet
15:16:32 ... It's worth discussing again
15:18:02 Ewan: A problem I found converting between ESEF and DAPT is with timeline references: you need at least one shared timecode value in the DAPT
15:18:29 ... the time codes are all relative to the media in DAPT, so without the value it's impossible to accurately convert between the two formats
15:18:39 ... so I looked for a value we could share, but didn't find one
15:18:59 ... EBU-TT has a first frame in programme
15:19:24 ... the ESEF format does have a field, but it's not implemented in a common authoring platform, so files won't have it
15:19:45 ... So add a new metadata field for the first frame of the content, which would be common to any exchange format
15:20:02 ... Not clear if it should be in DAPT or drawn from another spec like TTML2
15:20:32 Nigel: There's a compilation process that happens, where the input to it, in broadcast workflows, is expressed in SMPTE time code
15:20:41 ... used for synchronisation in playout
15:21:13 ... so although we don't have SMPTE timecode in DAPT, if you're generating a file with the AD content in it, you need to associate the timeline with the SMPTE timecode
15:21:53 ... In this common example, where you don't know all the info, there's one piece of data missing; this proposal is to add DAPT metadata to say where time 0 matches some SMPTE time code
15:22:18 ... So rather than expressing all times in DAPT in SMPTE timecode, have one point in time as a cross-reference
15:22:51 Mike: I wonder if using a timecode that has gaps is a harder problem to solve, and if it would be more productive to do in a DASH context, where it's broadly understood
15:22:55 q?
15:23:23 ... For timed text we don't permit there to be gaps, e.g., a track such as a wave file with DAPT in it, it's not OK to have it start/stop, put in a null segment
15:24:04 Nigel: I'm not sure I understand how that would work. How would you generate DASH that knows this? This is before decoding and packaging
15:24:12 ... Agree that the audio file has to be continuous
15:24:39 ... It needs to have the same play rate in the media as the resulting compiled audio file so you can play them in sync
15:24:52 ... DASH doesn't have SMPTE time code?
15:24:59 Mike: No, time of day in UTC
15:25:25 Nigel: If you had an external wrapper for DAPT, you could put additional info in it
15:25:44 Cyril: It should be possible to do a lossless round trip, at least
15:26:00 ... even if with external vocabulary
15:26:23 Nigel: That's the key question, should it be external or natively supported in DAPT, as it'll be a common issue and we could solve it in a common way
15:26:41 Cyril: How much vocabulary would it pull in, can we add just that one attribute?
15:26:48 atai has joined #tt
15:26:50 Nigel: You can
15:27:26 ... The value in the EBU-TT metadata spec isn't exactly what we need, there isn't one that relates exactly to this
15:27:51 q+
15:28:16 ... Document start of programme in #1 but that info isn't available in ESEF, so you can't map it, but also can't rely on the DAPT media timeline being the start of the programme
15:28:34 q+
15:28:37 ... You don't know where on the programme content timeline that is, as the start of programme timecode is missing
15:28:52 ... so it becomes a circular dependency
15:29:26 Matt: Makes sense to me, there's an offset value in BWAV from which you can calculate start of programme
15:30:02 ... Hard to have a series of timed events, they always refer to another audio file or audio track in another media file, so borrowing document start of programme makes sense
15:30:26 Nigel: That feels like an interesting misuse, as time of first description may be a minute into the programme
15:30:53 ... So if you use document start of programme as start of first AD...
15:32:00 Cyril: Introduce an empty DAPT event before the first, then use start of programme for that. If we were to use this as a hack, the first description in the DAPT document would have the correct semantics for start of document?
15:32:04 Nigel: Don't think you would
15:32:49 ... You don't know how the AD in the file relates to the start of the programme in the original media. We lose the relationship with the timeline, so need a way to recreate it
15:33:19 ... My goal was to propose some data or metadata to say that time 0 in the media timeline corresponds to some SMPTE timecode, to rebuild the relationship between the timelines
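To illustrate the cross-reference idea in concrete terms, here is a minimal sketch; the zero-point value 09:59:30:00, the 25 fps non-drop-frame assumption, and all names are hypothetical, not drawn from DAPT or from any proposal text.

FRAME_RATE = 25  # assumed frame rate, non-drop-frame timecode

def timecode_to_frames(tc, fps=FRAME_RATE):
    """Parse HH:MM:SS:FF into a frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames, fps=FRAME_RATE):
    """Format a frame count as HH:MM:SS:FF."""
    return "{:02d}:{:02d}:{:02d}:{:02d}".format(
        frames // (fps * 3600), (frames // (fps * 60)) % 60,
        (frames // fps) % 60, frames % fps)

def media_time_to_timecode(seconds, zero_point_tc, fps=FRAME_RATE):
    """Map a DAPT media time (seconds from time 0) onto the programme timecode
    using a single zero-point cross-reference, as described above."""
    anchor = timecode_to_frames(zero_point_tc, fps)
    return frames_to_timecode(anchor + round(seconds * fps), fps)

# Example: if time 0 of the media corresponds to 09:59:30:00, a description
# starting 90 s into the media is at programme timecode 10:01:00:00.
print(media_time_to_timecode(90, "09:59:30:00"))

One anchor of this kind is enough to recover a SMPTE time for every event without expressing any other DAPT time in timecode.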
15:33:31 Matt: Works nicely with how our BWAVs work
15:33:49 q?
15:33:54 Nigel: We should look at how those two concepts coincide
15:34:19 Matt: We can use it to synchronise without a sidecar XML file
15:34:41 Pierre: I've seen people get in trouble doing that, the value is meaningless outside the context of the timed text file
15:35:02 Matt: It's in the compiled WAV file, agree it goes wrong if you mix and match
15:35:27 Pierre: Use a playlist, don't hard-code it into individual components of a playlist, from my experience
15:35:39 Andreas: How would this be resolved with playlists?
15:36:05 Pierre: If you have two separate components in a media playback, the way to relate their relative offsets is through a third object like a playlist
15:36:24 ... Alternative is to have multiplexes, to tightly bind the essence components
15:36:54 ... But as soon as they're not tightly bound they get separated, reused, so binding by inserting info individually stops working in my experience
15:37:08 Matt: The challenge here is that they come from different suppliers and different processes
15:37:29 ... Those suppliers need some way to have the relationship between the timelines
15:37:53 Pierre: The playlist would do that. Doesn't have to be an external file, could be an API
15:38:17 ack M
15:39:05 Nigel: Interesting point, unless they're tightly bound. The AD script and the original media are tightly bound, it's a 1:1 relationship
15:39:32 ... The scenario is more specific, and reliably specific, than the general case where you see those problems
15:40:31 ack at
15:40:40 Andreas: I understand both positions. The metadata can be meaningless or outside the control of how you exchange the AD. So it's at the user's risk to interpret the metadata and restrict the workflow
15:41:01 q+ to ask if the "compilation" timecode could be provided as an input into the conversion from ESEF to DAPT
15:41:16 ... I commented on this last time, the timecode of the first content in the AD isn't new, it's in EBU STL or EBU subtitles, time of first cue
15:41:53 ... If it makes sense to add metadata to refer to the zero timecode, it could also be used for other things, and DAPT could be used for other TTML profiles
15:42:06 ... If we use this kind of metadata, good to define it in a way that refers not only to DAPT
15:42:09 q?
15:42:27 ack n
15:42:27 nigel, you wanted to ask if the "compilation" timecode could be provided as an input into the conversion from ESEF to DAPT
15:42:33 Nigel: I want to make another suggestion, don't know how feasible it is
15:43:32 ... At the moment, the compilation gives a single continuous output media, with a timepoint expressed in timecode. Could that be provided as an input to the conversion from ESEF to DAPT, provided earlier, so that defines time 0? Then you don't need anything in DAPT as that defines the time of the output
15:44:00 Cyril: This question does seem applicable to more than DAPT, we should discuss it in the context of TTML2
15:44:24 Nigel: We can do that, but I'm trying to reframe it to make the problem disappear
15:44:47 Matt: Unless you're producing a BWAV, the WAV has no concept of timecode, so descriptions are offset from 0
15:45:17 ... When you want to consume that file downstream, the challenge is how does the consumer know how it relates to the asset it belongs to?
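For context on the BWAV offset Matt refers to, a minimal sketch of reading the TimeReference field (a sample count since midnight for the file's first sample) from a broadcast WAV's bext chunk, assuming a standard RIFF/WAVE layout per EBU Tech 3285; the file name ad_mix.wav is hypothetical and error handling is omitted.

import struct

def read_bwav_time_reference(path):
    """Return (TimeReference sample count, sample rate) from a broadcast WAV."""
    with open(path, "rb") as f:
        riff, _size, wave_id = struct.unpack("<4sI4s", f.read(12))
        if riff != b"RIFF" or wave_id != b"WAVE":
            raise ValueError("not a RIFF/WAVE file")
        time_reference = sample_rate = None
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            chunk_id, chunk_size = struct.unpack("<4sI", header)
            if chunk_id == b"fmt ":
                data = f.read(chunk_size)
                sample_rate = struct.unpack_from("<I", data, 4)[0]
            elif chunk_id == b"bext":
                data = f.read(chunk_size)
                # TimeReferenceLow/High follow the fixed-length text fields
                # (256 + 32 + 32 + 10 + 8 bytes), i.e. at byte offset 338.
                low, high = struct.unpack_from("<II", data, 338)
                time_reference = (high << 32) | low
            else:
                f.seek(chunk_size, 1)  # skip other chunks, e.g. the audio data
            if chunk_size % 2:
                f.seek(1, 1)  # chunks are word-aligned
        return time_reference, sample_rate

samples, rate = read_bwav_time_reference("ad_mix.wav")  # hypothetical file
if samples is not None and rate:
    print(f"first sample is {samples / rate:.3f} s after midnight")

Dividing the sample count by the sample rate gives the offset of the file's first sample, from which a start-of-programme timecode can be derived as described.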
15:46:25 Pierre: In my mind that's a workflow issue. Whoever is producing the wave file needs source material. Can be done in different ways, text with a playlist, a web player. There's some context that the wave file is part of
15:46:41 ... So the workflow is in charge of making sure those things stay synchronised
15:47:02 Matt: For us that's a proxy file, which must have the same timeline as the target
15:48:01 Pierre: One way to achieve that is send the proxy and require whoever creates the wave file to route it back into the proxy, so there's no ambiguity on the relationship between the two. Reingest the created audio essence back into their asset management system
15:48:21 ... Then the playlist provides the unambiguous relationship between them
15:48:58 ... Or use a web-based application that includes the original content, the proxy; then it's all done behind the scenes and the relationship is preserved by the system
15:49:29 Nigel: There's a legacy problem here, there's a large number of ESEF AD files that exist independently of any workflow or asset management system
15:49:50 ... If you have the original media, you can relate them, but you may not have access to that when you want to convert to a different format
15:49:55 ... That's part of the challenge here
15:50:12 ... So going back to my original question of providing the data upfront, you can't because you don't have it
15:50:53 ... If you want to avoid having additional metadata, the conversion task has to look it up from somewhere else, and that may not be easily accessible
15:52:24 Ewan: Yes. My feeling is, in the absence of the data to convert a script to the DAPT file, you'd have an archive of ESEF files, so it would extend the life of the ESEF standard. A service provider trading in scripts from other providers using non-DAPT formats may not be able to exchange them without that additional context
15:53:50 Nigel: We could express it as metadata, for deferred processing, rather than making it a fixed offset. Does that tie it too closely to a specific process, not generic enough?
15:54:31 Matt: Our files have a content start time and content end time, in the ESEF header
15:54:50 ... Relies on the describer putting the data in
15:55:19 ... But if we have a wav file that doesn't match the duration of the content, things go wrong
15:55:30 Ewan: That's #230
15:55:55 ... The compiled wav may extend beyond the content end time
15:56:16 ... Not always possible to populate the value
15:57:11 SUMMARY: Issue discussed, alternative workflows considered, potentially frame as "deferred conversion data" or similar.
15:58:20 Topic: Metadata attributes apply as well as elements w3c/ttml2#1273
15:58:27 github: https://github.com/w3c/ttml2/pull/1273
15:58:32 SUMMARY: Review needed
15:58:55 Topic: TPAC 2024
15:59:15 -> Draft TPAC 2024 timetable https://www.w3.org/2024/05/tpac2024-schedule-20240523.html
16:00:43 Nigel: TTWG meets on the Friday, joint meeting with APA. Overlaps with MEIG. Monday joint meeting with MEIG.
16:01:16 ... Joint meeting with ADCG. Schedule looks awkward. Any comments or requests?
16:01:29 Topic: Meeting close
16:01:59 Nigel: Thanks everyone. Apologies for being slightly over time.
[adjourns meeting]
16:02:14 rrsagent, make minutes
16:02:15 I have made the request to generate https://www.w3.org/2024/06/06-tt-minutes.html nigel
16:12:29 rrsagent, make minutes
16:12:30 I have made the request to generate https://www.w3.org/2024/06/06-tt-minutes.html nigel
16:13:51 scribeOptions: -noEmbedDiagnostics -final
16:13:54 zakim, end meeting
16:13:54 As of this point the attendees have been Nigel, Chris_Needham, Matt, Pierre, Mike, Ewan, Cyril, Andreas, Atsushi
16:13:56 RRSAgent, please draft minutes v2
16:13:58 I have made the request to generate https://www.w3.org/2024/06/06-tt-minutes.html Zakim
16:14:04 I am happy to have been of service, nigel; please remember to excuse RRSAgent. Goodbye
16:14:04 Zakim has left #tt
16:14:21 rrsagent, excuse us
16:14:21 I see no action items