IRC log of wicg on 2019-09-19

Timestamps are in UTC.

00:12:35 [RRSAgent]
RRSAgent has joined #wicg
00:12:35 [RRSAgent]
logging to https://www.w3.org/2019/09/19-wicg-irc
00:13:17 [Travis]
.. joining the meeting in progress
00:13:37 [Travis]
SubTopic: Support for certs not anchored to Web PKI (Self-signed)
00:14:14 [Travis]
..
00:14:32 [Travis]
SubTopic: Provide a way to specify a subprotocol
00:14:53 [Travis]
It's not trivial to find a place to put this in the protocol.
00:15:42 [Travis]
Thought: put in first string, but not block data.
00:15:51 [Travis]
Could do this without losing RTT
00:17:03 [Travis]
But don't want to add extra complexity
00:23:28 [Travis]
Two other issues not covered: congestion control and stream prioritization
00:24:13 [iclelland]
iclelland has joined #wicg
00:24:32 [wonsuk]
wonsuk has joined #wicg
00:24:54 [yoav]
q?
00:25:18 [jbroman]
Topic: WebCodecs
00:25:26 [jbroman]
Peter Thatcher: does anyone want an intro?
00:25:31 [Travis]
Scribe: jbroman
00:25:32 [jbroman]
(response is yes)
00:25:46 [yhirano_]
yhirano_ has joined #wicg
00:26:01 [jbroman]
low-level API for media encode/decode
00:26:08 [jbroman]
based on transform streams
00:26:33 [jbroman]
useful for live streaming, faster-than-realtime encoding, realtime comms, ...
00:27:04 [jbroman]
VideoEncoder takes image data and makes a bitstream
00:27:23 [jbroman]
typically gotten from a MediaStreamTrack via VideoTrackReader
00:27:43 [jbroman]
not strictly part of web codecs but pulled in as a dependency
00:28:29 [jbroman]
slide example of constructing audio/video decoders from some readable streams
00:28:40 [jbroman]
and plugging that into a <video> element
00:29:08 [jbroman]
slide example of using this to transcode media by piping between decoder and encoder
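(For context, a rough sketch of the slide examples in the stream-based shape described above. VideoTrackReader, the constructor options, and the sink/source variables are provisional names or placeholders from the explainer of the time, not a shipped API:)
    // Encode: camera track -> VideoTrackReader -> VideoEncoder -> encoded chunks.
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    const [track] = stream.getVideoTracks();
    const reader = new VideoTrackReader(track);            // provisional name
    const encoder = new VideoEncoder({ codec: 'vp8' });    // provisional options
    await reader.readable
      .pipeThrough(encoder)          // transform stream: raw frames in, encoded chunks out
      .pipeTo(chunkSink);            // placeholder WritableStream (e.g. a network sender)

    // Transcode: pipe a decoder's output straight into an encoder.
    await encodedChunkSource         // placeholder ReadableStream of encoded chunks
      .pipeThrough(new VideoDecoder({ codec: 'vp8' }))
      .pipeThrough(new VideoEncoder({ codec: 'av1' }))
      .pipeTo(outputSink);           // placeholder WritableStream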
00:29:10 [Travis]
q+ to ask about codec discovery if not covered in the presentation...
00:30:14 [jbroman]
slide comparing to MediaRecorder: more low-level, ...
00:30:36 [jbroman]
similar comparison to MSE
00:31:07 [jbroman]
comparison to WebRTC: more low-level, not tied to network transport, can choose what kind of transport you want
00:31:17 [jbroman]
buffering is done by the application, not browser
00:31:31 [jbroman]
all of this can be done in wasm but this is faster and more power-efficient
00:31:44 [jbroman]
access to hw encode/decode
00:32:14 [jbroman]
track reader/writer would be useful for wasm codecs too, avoids having to use <canvas>
00:32:35 [yoav]
q+ on images
00:32:43 [jbroman]
proposal, explainer and webidl but no spec
00:32:46 [jbroman]
impl starting soon in chromium
00:32:47 [jbroman]
q?
00:32:52 [jbroman]
ack Travis
00:32:52 [Zakim]
Travis, you wanted to ask about codec discovery if not covered in the presentation...
00:33:10 [jbroman]
Travis: can you enumerate supported codecs?
00:33:20 [jbroman]
Peter Thatcher: that's the first issue under capabilities
00:33:36 [jbroman]
we want to use media capabilities; don't want to duplicate it
00:33:45 [jbroman]
there are specific needs to add to capabilities
00:33:59 [jbroman]
e.g. low latency vs b-frames
00:34:12 [jbroman]
a lot of people want to explicitly choose hw or sw encoder, since some hw encoders are bonkers
00:34:29 [jbroman]
for low-level things like svc, there is no support inside MediaRecorder or MSE but we want it
00:34:45 [jbroman]
concerned about expanding the fingerprinting surface by telling more to the app
00:34:59 [jbroman]
Travis: have you approached them?
00:35:08 [jbroman]
Peter Thatcher: yes, and they're engaging
00:35:16 [jbroman]
Travis: agree with general sentiment
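(For context, a minimal sketch of the existing Media Capabilities query the group wants to build on rather than duplicate; the configuration values are illustrative:)
    // Existing navigator.mediaCapabilities API used for decode-support discovery today.
    const info = await navigator.mediaCapabilities.decodingInfo({
      type: 'file',
      video: {
        contentType: 'video/mp4; codecs="avc1.64001f"',
        width: 1920,
        height: 1080,
        bitrate: 2_500_000,   // bits per second
        framerate: 30,
      },
    });
    // info.supported, info.smooth, info.powerEfficient
    // The needs mentioned above (low latency vs b-frames, hw vs sw preference) would be additions.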
00:35:17 [jbroman]
q?
00:35:19 [jbroman]
ack yoav
00:35:19 [Zakim]
yoav, you wanted to comment on images
00:35:34 [jbroman]
yoav: you mentioned image encode/decode; I suspect this will be an ask
00:35:43 [jbroman]
what is the current status?
00:35:57 [jbroman]
Peter Thatcher: there was a lengthy discussion; even made a pr
00:36:15 [jbroman]
the more we dug into the use cases people wanted, the less clear it became whether it was worth it vs wasm or having a one-frame video
00:36:20 [jbroman]
currently consider it out of scope
00:36:38 [jbroman]
it occurred to us, after an <img>/<video> discussion yesterday, that if we could take decoder output and put it in an <img>, that would be good
00:36:51 [dharani]
dharani has joined #wicg
00:37:00 [jbroman]
yoav: I think img/video discussion is different because if app developers are involved, they can just have a muted <video> in all cases except background-image
00:37:08 [jbroman]
in background-image case this solution wouldn't help anyway
00:37:25 [jbroman]
I don't think that covers the use cases people have when they want to put videos into <img> elements
00:37:36 [jbroman]
you're probably correct that people can do things with one-frame videos
00:37:47 [jbroman]
at least in image formats today
00:37:58 [jbroman]
there might be an image format in the future that is not 100% identical to a video format even if inspired by one
00:38:05 [jbroman]
why does it matter if it's one-frame video or image format?
00:38:13 [jbroman]
Peter Thatcher: one interesting case is gifs
00:38:22 [jbroman]
somebody wanted to encode (animated) gifs
00:38:26 [jbroman]
yoav: as an encoding output?
00:38:28 [jbroman]
Peter Thatcher: yes
00:38:31 [jbroman]
would we call that a video?
00:38:38 [jbroman]
there is no containerization right now
00:38:44 [jbroman]
is there such a thing as a containerized gif?
00:38:59 [jbroman]
images always have containerization; for video it's nice to have a bitstream w/o a container
00:39:03 [jbroman]
yoav: ok
00:39:05 [jbroman]
q?
00:39:13 [jbroman]
Steve Anton: in summary it seemed like scope creep
00:39:21 [jbroman]
can do image encode/decode with current apis
00:39:35 [jbroman]
yoav: would be interested to hear the use case for gif; do we really want more animated gifs?
00:39:43 [jbroman]
???: lots of internet traffic is memes
00:40:00 [jbroman]
Travis: drill into answer about existing way to do image encode
00:40:06 [jbroman]
Steve Anton: you can write it to a canvas
00:40:36 [jbroman]
Travis: want to be able to bring image payload down from server and decode it later, and recapture the memory later if it goes out of view
00:40:42 [jbroman]
want to manage that at a fine-grained level
00:40:53 [jbroman]
rbyers: didn't we recently ship something like that (jsbell?)
00:41:04 [iclelland]
iclelland has joined #wicg
00:41:33 [jbroman]
rbyers: HTMLImageElement.decode; forces eager decode
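(For context, a minimal example of the HTMLImageElement.decode() API mentioned here; the URL is illustrative:)
    // Force an eager decode and only attach the element once the pixels are ready.
    const img = new Image();
    img.src = 'large-photo.jpg';     // illustrative URL
    await img.decode();              // resolves once the image data has been decoded
    document.body.appendChild(img);
    // Dropping references to the element later lets the browser reclaim the decoded memory,
    // though this does not give the fine-grained control being asked about here.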
00:41:41 [jbroman]
Travis: can you provide user-defined codec?
00:41:47 [jbroman]
Peter Thatcher: yes, injectable codecs
00:42:06 [jbroman]
can use simd extensions to wasm etc., but need to get data into/out of it efficiently
00:42:12 [jbroman]
that's where track reader/writer comes into play
00:42:18 [jbroman]
we discussed this at a breakout session
00:42:26 [jbroman]
want to allow injectable codecs in WebRTC API
00:42:31 [jbroman]
Travis: can this be done off-thread?
00:42:38 [jbroman]
Peter Thatcher: should be possible, in a worklet
00:42:55 [jbroman]
Steve Anton: Web Codecs is more about exposing the platform codecs
00:43:11 [jbroman]
though there is interest in allowing injectable codecs in MSE, etc
00:43:23 [jbroman]
it is more relevant to those APIS than to Web Codecs
00:43:55 [jbroman]
Amit Hilbuch: don't you want it to work the same way as web codecs?
00:44:12 [jbroman]
Peter Thatcher: similar API shape, could use Web Codecs or custom codecs
00:44:22 [jbroman]
Steve Anton: could imagine constructing a codec on a worker, using it on main thread, etc
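(For context, a rough sketch of how an injectable wasm codec could slot into the same stream-based shape; VideoTrackReader, myWasmEncoder, cameraTrack, and the sink are assumed placeholders, not specified API:)
    // A user-provided codec as a TransformStream: raw frames in, encoded chunks out.
    const wasmEncode = new TransformStream({
      transform(frame, controller) {
        // Hand the raw frame to a wasm-compiled encoder (placeholder) and emit its output.
        controller.enqueue(myWasmEncoder.encode(frame));
      },
    });
    await new VideoTrackReader(cameraTrack).readable   // provisional name, placeholder track
      .pipeThrough(wasmEncode)
      .pipeTo(networkSink);                            // placeholder WritableStream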
00:45:04 [jbroman]
Peter Thatcher: Next issue: some codecs require extra out-of-band data, like H.264 SPS/PPS
00:45:27 [jbroman]
???: isn't this usually embedded in the container?
00:45:40 [jbroman]
Steve Anton: could be, yes
00:45:57 [jbroman]
Peter Thatcher: usually extracted from container format and then passed to decoder
00:46:15 [jbroman]
(example shows it passed as options to the VideoDecoder ctor)
00:46:40 [jbroman]
Steve Anton: this is data necessary for decoder itself
00:46:54 [jbroman]
Amit Hilbuch: should these be specified in IDL or be freeform?
00:47:05 [jbroman]
Peter Thatcher: in WebRTC we have key-value pairs; would probably mimic that
00:47:11 [jbroman]
Steve Anton: settings are codec-specific
00:47:31 [jbroman]
Amit Hilbuch: agree, wouldn't want to bake specific codec parameters into web standard
00:48:08 [jbroman]
Steve Anton: is it fine to specify statically at decoder construction, or would they need to change later on, e.g. per-chunk
00:48:31 [jbroman]
Amit Hilbuch: how are these different from ???
00:48:41 [jbroman]
Steve Anton: distinction between things needed to understand bitstream vs encoder hints
00:49:28 [jbroman]
Steve Anton: are there parameters needed to understand the bitstream that are not known when the decoder is constructed?
00:49:33 [jbroman]
need to consult codec experts
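(For context, a sketch of the slide example above: codec-specific out-of-band data, e.g. H.264 SPS/PPS extracted from the container, passed as options to the VideoDecoder constructor. The option name "description" and the byte source are assumptions:)
    // Out-of-band decoder setup data supplied statically at construction time.
    const decoder = new VideoDecoder({
      codec: 'avc1.64001f',
      description: avcConfigBytes,   // Uint8Array pulled out of the container (placeholder)
    });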
00:49:41 [jbroman]
Peter Thatcher: next issue: decoding b-frames
00:49:55 [jbroman]
in video codecs there are dependencies between frames
00:50:00 [jbroman]
b-frames depend on future frames
00:50:03 [jbroman]
need to decode out of order
00:50:12 [jbroman]
how do we support this in the decoder api?
00:50:20 [jbroman]
yoav: buffering?
00:50:45 [jbroman]
Peter Thatcher: require JS to handle it by getting it in decode order; JS can reorder after
00:50:55 [jbroman]
but hardware decoders reorder internally
00:51:07 [jbroman]
Steve Anton: feels like most decoders could efficiently reorder for you
00:51:11 [jbroman]
even sw decoders
00:51:23 [jbroman]
Peter Thatcher: do we make the API such that decoders can do it more efficiently?
00:51:32 [jbroman]
so far we've added a timestamp to the info going into the decoder
00:51:37 [jbroman]
so that it can do the buffering
00:51:41 [jbroman]
but if you opt into low latency
00:51:46 [jbroman]
then this timestamp would be unimportant
00:51:58 [jbroman]
Steve Anton: requiring timestamps doesn't seem too onerous if you don't need them
00:52:01 [jbroman]
can just increment by 1
00:52:08 [jbroman]
Travis: seems like surrogate pairs
00:52:12 [jbroman]
where it's useless to get half of it
00:52:21 [jbroman]
almost always just want full character in e.g. HTML parser
00:52:40 [jbroman]
is the b-frame an implementation detail, or does it need to be exposed?
00:52:49 [jbroman]
Steve Anton: wouldn't get partial images; just get frames out of order
00:52:57 [jbroman]
would get keyframe and then partial frames that come before
00:53:14 [jbroman]
would the decoder expose the keyframe or wait until after the delta frames have been emitted
00:54:09 [jbroman]
yoav: I would ask developers (a) is there an actual use case to get the frames out of order? (b) what is their preference? do they want the API to magically handle that or not?
00:54:17 [jbroman]
Peter Thatcher: that's a good idea to ask on discourse
00:54:21 [jbroman]
pretty sure I know what they will say
00:54:31 [jbroman]
"do you want to do extra work or do you want us to?"
00:54:41 [jbroman]
yoav: would it be more efficient in some way if it's done in js? doesn't seem so
00:54:55 [jbroman]
???: which timestamp?
00:55:01 [jbroman]
Peter Thatcher: presentation timestamp
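(For context, a sketch of the option being discussed: each encoded chunk fed to the decoder carries a presentation timestamp so the decoder can buffer and reorder b-frames internally. The chunk shape and the decoder's writable end are assumptions:)
    // Chunks are written in decode order; the timestamp says where each frame
    // belongs in presentation order, letting the decoder reorder b-frames itself.
    const decoderWriter = decoder.writable.getWriter();   // writable end assumed
    for (const chunk of demuxedChunks) {                  // placeholder demuxer output
      await decoderWriter.write({
        timestamp: chunk.presentationTimestamp,           // from the container
        data: chunk.data,
      });
    }
    // Apps with no reordering needs (e.g. low latency, no b-frames) could just pass
    // an incrementing counter as the timestamp, per the suggestion above.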
00:55:27 [jbroman]
Peter Thatcher: next issue: complex coding modes
00:55:32 [jbroman]
e.g. SVC
00:55:37 [jbroman]
SVC is Scalable Video Coding
00:55:44 [jbroman]
progressive images, but for video
00:55:49 [jbroman]
different "layers" that depend on each other
00:55:59 [jbroman]
useful in real-time communication, e.g. video conferencing
00:56:13 [jbroman]
surprised it hasn't been popular in livestreaming
00:56:19 [jbroman]
Steve Anton: also useful in offline playback
00:56:30 [jbroman]
Peter Thatcher: with only a portion of the bitstream, can decode a lower-quality version
00:56:40 [jbroman]
yoav: how does that work wrt video length?
00:56:45 [jbroman]
is it "for every x frames"?
00:57:23 [jbroman]
(overlapped discussion about how this is implemented, and complicated diagrams in a spec)
00:57:36 [jbroman]
Peter Thatcher: simple case has bigger frames dependent on smaller frames
00:57:40 [jbroman]
yoav: okay, so each frame is progressive
00:57:47 [jbroman]
Peter Thatcher: but it can get more complicated
00:57:51 [tantek]
tantek has joined #wicg
00:57:54 [jbroman]
Steve Anton: SVC is similar to JPEG progressive decode
00:58:12 [jbroman]
Peter Thatcher: from encoder API standpoint, put in 1 frame and get out n frames for same timestamp
00:58:15 [tantek]
RRSAgent, make logs public
00:58:31 [jbroman]
dependencies between them
00:58:36 [jbroman]
similar to time dependencies in normal delta frames
00:58:42 [jbroman]
but this impacts the api
00:58:51 [jbroman]
when decoding, get different frames for same timestamp
00:59:03 [jbroman]
maybe you get a higher-quality frame after having rendered the lower-quality frame
00:59:16 [jbroman]
1 frame in - 1 frame out no longer applies
00:59:21 [jbroman]
have tried to integrate into the api
00:59:29 [jbroman]
but expressing how you want it set up can be complex
00:59:39 [jbroman]
expressing what is possible through media capabilities is also complex
00:59:47 [jbroman]
yoav: is there a preference
01:00:00 [jbroman]
e.g. I'm a low-power device, only want partial layers?
01:00:11 [jbroman]
but client and server could have communicated that
01:00:27 [jbroman]
Peter Thatcher: in real-time comms, client asks for higher res version that is available according to network conditions
01:00:36 [jbroman]
yoav: device could say max quality it can handle
01:00:45 [jbroman]
Amit Hilbuch: that's in signaling
01:00:54 [jbroman]
yoav: maybe in media capabilities you want to know what's possible
01:01:05 [jbroman]
Peter Thatcher: might have a limit in how many layers you can do
01:01:10 [jbroman]
not just for video
01:01:15 [jbroman]
in audio, opus fec
01:01:28 [jbroman]
a particular packet can contain the current time's high quality audio and the previous time's low quality audio
01:01:36 [jbroman]
so if you miss a packet you can fill in with the low quality version
01:01:54 [jbroman]
so from an opus bitstream you can decode either high or low quality data
01:02:03 [jbroman]
Steve Anton: might be able to figure it out automatically
01:02:07 [jbroman]
based on gaps in timestamps
01:02:11 [jbroman]
Peter Thatcher: good point
01:02:14 [jbroman]
could treat it like SVC
01:02:28 [jbroman]
if you get the earlier one first and high quality one later, get a second audio frame at the same time
01:02:37 [jbroman]
Steve Anton: at encode side, just need a way to turn it on
01:02:40 [jbroman]
Peter Thatcher: agreed that it's easy
01:02:46 [jbroman]
decode is what's interesting
01:03:11 [jbroman]
Steve Anton: with SVC, multiple layers make up one frame
01:03:17 [jbroman]
if you expect to get all the layers you want to wait
01:03:37 [jbroman]
might need a way to flush what's currently available
01:03:56 [jbroman]
yoav: how can we facilitate this in the future?
01:04:05 [jbroman]
Peter Thatcher: might not want to implement all of this now
01:04:13 [jbroman]
but don't want to paint ourselves into a corner
01:04:18 [jbroman]
Steve Anton: can't assume frame in, frame out
01:04:30 [jbroman]
yoav: at the same time, a future application using those coding modes will be aware of it
01:04:38 [jbroman]
they will know that multiple frames can come out, etc
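(For context, a sketch of why "one frame in, one frame out" breaks down with SVC or Opus FEC on the decode side: the output stream may yield more than one frame for the same timestamp, e.g. a low-quality frame followed later by a higher-quality refinement. The readable end and the render helpers are placeholders:)
    // Read decoded frames; a repeated timestamp means a higher-layer refinement
    // of a frame that may already have been shown.
    const frameReader = decoder.readable.getReader();   // readable end assumed
    let lastTimestamp = -1;
    for (;;) {
      const { value: frame, done } = await frameReader.read();
      if (done) break;
      if (frame.timestamp === lastTimestamp) {
        rerender(frame);   // placeholder: replace the already-displayed frame
      } else {
        render(frame);     // placeholder: first (possibly lowest-layer) frame for this time
      }
      lastTimestamp = frame.timestamp;
    }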
01:05:00 [jbroman]
Peter Thatcher: skipping Timestamps and time domains because Paul isn't here
01:05:42 [jbroman]
short version: when we go from readable streams to tracks, for audio you want to give it to the OS as needed; for video, to keep it smooth, you want to say when a video frame is presented
01:06:01 [jbroman]
if you just take frames from decoder as fast as they come out, it might not be smooth
01:06:12 [jbroman]
???: might need to drop frames to keep smooth?
01:06:31 [jbroman]
Peter Thatcher: how do you decide which frame goes out on a particular vsync?
01:06:48 [jbroman]
if we put JS in complete control, prone to JS pauses
01:06:59 [jbroman]
Steve Anton: concern already exists e.g. with WebGL
01:07:05 [jbroman]
so might not be intractable
01:07:12 [jbroman]
Peter Thatcher: might not be clear until we try to implement
01:07:15 [jbroman]
Paul has better ideas
01:07:41 [jbroman]
Todd: expected to be done off render thread?
01:07:57 [jbroman]
Peter Thatcher: yes; still may want to control what frame goes out on what vsync
01:08:03 [jbroman]
Todd: sounds like it might be more efficient
01:08:19 [jbroman]
Peter Thatcher: if latency isn't as important, buffering to provide smoothness might be more important
01:08:23 [jbroman]
vs real-time use cases
01:09:15 [Travis]
Thanks jbroman !
01:09:20 [jbroman]
Topic: information pipeline between WICG and web developers
01:09:30 [jbroman]
yoav: don't know if this is interesting to non-chairs
01:09:41 [jbroman]
Todd: what is the core of the question?
01:09:56 [jbroman]
yoav: I think discourse today is mostly used as a "bring us your proposal" forum
01:10:01 [jbroman]
not the most effective way to get stuff done
01:10:27 [jbroman]
would be interesting to use it more, or use another means more, to get use cases from developers
01:10:31 [jbroman]
and then iterate over API shape questions
01:10:46 [jbroman]
Todd: when you say use cases, you mean problems that are difficult or impossible to solve?
01:10:51 [jbroman]
and exactly what are you trying to do?
01:10:58 [jbroman]
rather than "I dreamed of this API shape"
01:11:10 [jbroman]
yoav: is that worthwhile to discuss in this forum?
01:13:45 [jbroman]
cwilso: is discourse the right entry point to wicg, or would something else make more sense? I drew the parallel in the AC meeting; there is 1:1 overlap between ???. The way we start things up is similar to WICG. We ask to see evidence that >1 person cares about this problem to give it legitimacy. Immersive Web Group has a specific "proposals" repo. File an issue; everyone follows this repo. Sometimes we still have competing repos about interesting ideas. That's how we do it; we barely use our mailing list. Would that be easier than using discourse?
01:14:16 [jbroman]
I don't go to discourse often
01:14:20 [jbroman]
Travis: but you do go to github often
01:14:35 [jbroman]
cwilso: I do, but don't necessarily dig into the right repo every day
01:14:46 [jbroman]
Todd: how to keep web developers engaged, understand what are the issues?
01:15:02 [jbroman]
cwilso: process for starting in WICG is intended to be easy but still have a minimum bar
01:15:12 [jbroman]
we do ask that every incubation proposal has >1 organization
01:15:21 [jbroman]
if 15 Google employees say it's a good idea, that's not good enough
01:15:29 [jbroman]
but 1 web dev + 1 MS employee, that's good enough
01:15:38 [jbroman]
some measure of cross-industry interest
01:15:44 [jbroman]
are we using the right mechanism for that?
01:15:50 [Guest19]
Guest19 has joined #wicg
01:16:00 [jbroman]
yoav: there's the question of engagement, and beyond that, once you've got people engaged, what do you actually want from them?
01:16:23 [jbroman]
I think I have traditionally tried to get web devs more involved in standards, but it may have been a mistake in the sense that people don't necessarily have time to do that on top of their job
01:16:50 [jbroman]
at the same time, an engagement that says "here is a problem space we're looking at; give us your use cases and pain points"
01:17:04 [jbroman]
and they just have to write 100 words and leave and don't have to follow up as a long-term commitment
01:17:13 [jbroman]
that is something that developers are actually likely to be interested in doing
01:17:17 [jbroman]
if we make it easy/welcoming enough
01:17:35 [jbroman]
Travis: I wonder how much a developer feels like they get to be a part of it if they just pop in, make a suggestion, and never come back
01:17:38 [jbroman]
what's missing is the follow-up
01:17:44 [jbroman]
that we turned your random idea into something complete
01:17:48 [jbroman]
wdyt? does it still work?
01:17:59 [jbroman]
How can we make sure that's working unless we keep them engaged?
01:18:01 [jbroman]
yoav: discourse, github
01:18:24 [jbroman]
once we have some API shape, we can ask them if it would address their use case or not
01:18:35 [jbroman]
we can try to have long-term engagement in terms of revalidating use cases, API shape, etc over time
01:18:37 [jbroman]
until shipped
01:18:41 [jbroman]
but doesn't require them to do a ton of work
01:18:55 [jbroman]
Travis: could be done through discourse or github, orthogonal to the question?
01:18:59 [jbroman]
yoav: not a tooling question
01:19:14 [jbroman]
do we want to take that approach? can pick a tool after that
01:19:31 [jbroman]
chairs, does that make sense as a browser developer?
01:19:46 [jbroman]
Steve Anton: web developers have a hard time getting excited about proposals
01:19:50 [jbroman]
they seem so far off
01:19:52 [jbroman]
first step is use cases
01:19:58 [jbroman]
devs happy to talk about use cases
01:20:19 [jbroman]
there's a gap where it's hard to get them involved in middle stages
01:20:21 [jbroman]
until origin trial
01:20:43 [jbroman]
cwilso: early part of "Web dev has a use case" is outside of this
01:20:52 [jbroman]
more like "web we want" effort
01:21:08 [jbroman]
let's turn early use cases into actionable problem statements for further incubation
01:21:12 [jbroman]
so many people come up with problems
01:21:21 [jbroman]
but thinking about how to solve in a broader space is hard sometimes
01:21:42 [jbroman]
I'm more concerned about having an active set of people, how do we get more people engaged in solution efforts?
01:21:51 [jbroman]
yoav: define "engage in solution efforts"
01:22:00 [jbroman]
cwilso: iterating on problem statements, solutions in incubation process
01:22:28 [jbroman]
a number of incubations had people saying it was interesting, but it turns into one person creating, proposing, implementing, and experimenting w/o a lot of engagement
01:22:29 [jbroman]
that's tough
01:22:39 [jbroman]
not saying that's a totally broken pattern but shouldn't be the usual pattern
01:22:55 [jbroman]
Todd: if there are 5 people on discourse saying it's a good idea and one person begins work, how do all 5 people contribute?
01:23:03 [jbroman]
cwilso: even continually engaging after that
01:23:12 [jbroman]
one downside of discourse is that it's for initial stages only
01:23:18 [jbroman]
once there's a repo, it's disconnected from discourse
01:23:25 [jbroman]
conversation moves into a different tool altogether
01:23:29 [jbroman]
that seems odd
01:23:44 [jbroman]
Peter Thatcher: with Web Transport, had some people interested; wanted to have community discussion
01:23:50 [jbroman]
wasn't sure whether to use discourse, github, mailing list, ?
01:23:56 [jbroman]
lack of a way to keep interested people in the loop
01:24:07 [jbroman]
Todd: does an individual effort have a single discourse thread?
01:24:16 [jbroman]
cwilso: usually one, but once repo created there's no conversation there
01:24:29 [jbroman]
Todd: a bot to update discourse thread with # issues created etc?
01:24:35 [jbroman]
yoav: there are also labels
01:24:53 [jbroman]
theoretically you could have multiple threads and have them labelled
01:25:02 [jbroman]
Todd: do we think the people in the discourse thread don't go to the github
01:25:06 [jbroman]
or do they not know? not have time?
01:25:08 [jbroman]
do we know why?
01:25:12 [jbroman]
cwilso: we don't truly know why
01:25:25 [jbroman]
part of it is that it's a barrier that we need to see engagement to create the repo
01:25:32 [jbroman]
engagement beyond that is a long-term interest thing
01:25:38 [jbroman]
human nature to have limited time/effort to expend
01:25:57 [jbroman]
superstar is co-chair Marco (sp?) who comments on everything which is great
01:26:15 [jbroman]
suspect if we used GitHub as that comm tool it might be easier to get interest from other people
01:26:25 [jbroman]
we could manage tagging/labelling things in the proposals repo
01:26:33 [jbroman]
so people could follow e.g. styling or networking proposals
01:26:48 [jbroman]
Todd: replacing discourse with github?
01:26:50 [jbroman]
cwilso: yeah
01:27:34 [jbroman]
Peter Thatcher: when people were interested in Web Transport/Codecs, had to tell people to come to discourse page and people weren't familiar
01:27:50 [jbroman]
maybe GitHub would be better
01:28:04 [jbroman]
yoav: one way to resolve question would be to ask it
01:28:09 [jbroman]
do a survey
01:28:20 [jbroman]
provide multiple engagement mechanisms and ask which they would prefer
01:28:25 [jbroman]
discourse vs github vs ??? with examples
01:28:52 [jbroman]
Peter Thatcher: have you talked to foms (sp?); they have engagement from web devs about media stuff
01:29:04 [jbroman]
whatever they're doing seems to do a good job
01:29:23 [jbroman]
Todd: web performance wg has attended velocity conference (now renamed)
01:29:26 [cwilso]
s/foms/FOMS: http://www.foms-workshop.org/foms2019SF/
01:29:40 [jbroman]
that did get some engagement
01:29:52 [jbroman]
yoav: view source next week would be a good place to collect anecdata
01:29:56 [jbroman]
cwilso: we should wrap up
01:30:06 [Travis]
RRSAgent: make minutes
01:30:06 [RRSAgent]
I have made the request to generate https://www.w3.org/2019/09/19-wicg-minutes.html Travis
01:30:17 [Travis]
thanks Zakim
01:31:38 [iclelland]
iclelland has joined #wicg
01:49:11 [iclelland]
iclelland has joined #wicg
02:02:43 [iclelland]
iclelland has joined #wicg
02:05:15 [aboxhall_]
aboxhall_ has joined #wicg
02:06:30 [ericc]
ericc has joined #wicg
02:23:47 [jihye]
jihye has joined #wicg
02:51:12 [Zakim]
Zakim has left #wicg
03:02:48 [ericc]
ericc has joined #wicg
03:15:46 [iclelland]
iclelland has joined #wicg
03:49:32 [iclelland]
iclelland has joined #wicg
03:52:00 [iclelland]
iclelland has joined #wicg
04:02:16 [iclelland]
iclelland has joined #wicg
04:13:53 [ericc]
ericc has joined #wicg
04:34:59 [ericc]
ericc has joined #wicg
05:34:24 [Guest19]
Guest19 has joined #wicg
05:45:43 [onix]
onix has joined #wicg
05:48:55 [tiger]
tiger has joined #wicg
06:06:03 [iclellan1]
iclellan1 has joined #wicg
06:07:04 [bkardell_]
bkardell_ has joined #wicg
06:28:19 [iclelland]
iclelland has joined #wicg
06:30:22 [tantek]
tantek has joined #wicg
06:37:18 [ericc]
ericc has joined #wicg
07:53:26 [iclelland]
iclelland has joined #wicg
07:57:43 [iclellan1]
iclellan1 has joined #wicg
08:21:47 [tantek_]
tantek_ has joined #wicg
08:28:34 [iclelland]
iclelland has joined #wicg
09:29:46 [Rossen]
Rossen has joined #wicg
09:31:17 [leaverou]
leaverou has joined #wicg
10:57:02 [Guest19]
Guest19 has joined #wicg
12:28:13 [iclelland]
iclelland has joined #wicg
13:40:57 [iclelland]
iclelland has joined #wicg
14:33:02 [iclelland]
iclelland has joined #wicg