IRC log of immersive-web on 2022-06-14

Timestamps are in UTC.

18:57:46 [RRSAgent]
RRSAgent has joined #immersive-web
18:57:46 [RRSAgent]
logging to https://www.w3.org/2022/06/14-immersive-web-irc
18:59:03 [Leonard]
Leonard has joined #Immersive-web
18:59:10 [Leonard]
present+
19:01:33 [bajones]
bajones has joined #Immersive-Web
19:01:38 [laford]
laford has joined #immersive-web
19:02:29 [ada]
present+
19:03:18 [Ren]
Ren has joined #immersive-web
19:03:23 [atsushi]
WBS of charter review > https://www.w3.org/2002/09/wbs/33280/iwwg-charter-2022/
19:03:49 [laford]
unable to join the webex meeting
19:03:55 [laford]
stuck on "connecting"
19:04:01 [cabanier]
present+
19:04:10 [ada]
oh that's odd
19:04:11 [atsushi]
rrsagent, make log public
19:04:39 [ada]
it seems to be working laford
19:04:52 [atsushi]
AC rep list > https://www.w3.org/Member/ACList
19:04:58 [atsushi]
rrsagent, publish minutes
19:04:58 [RRSAgent]
I have made the request to generate https://www.w3.org/2022/06/14-immersive-web-minutes.html atsushi
19:05:31 [atsushi]
s/ > / -> /
19:05:34 [atsushi]
rrsagent, publish minutes
19:05:34 [RRSAgent]
I have made the request to generate https://www.w3.org/2022/06/14-immersive-web-minutes.html atsushi
19:05:55 [atsushi]
s/ > / -> /g
19:06:12 [laford]
I'm in
19:06:14 [laford]
+present
19:07:23 [yonet]
present+
19:07:48 [Leonard]
scribe leonard
19:08:14 [yonet]
Agenda: /invite RRSAgent #immersive-web
19:08:29 [yonet]
Agenda: https://www.w3.org/events/meetings/e04319d4-6887-4325-b02c-c4eff5814093/20220614T120000
19:08:30 [Leonard]
Agenda at https://github.com/immersive-web/administrivia/blob/main/meetings/wg/2022-06-14-Immersive_Web_Working_Group_Teleconference-agenda.md
19:08:48 [Leonard]
https://github.com/immersive-web/webxr-hand-input/issues/117
19:09:14 [atsushi]
s|Agenda: /invite RRSAgent #immersive-web||
19:09:25 [atsushi]
s/agenda at/agenda:/
19:09:28 [Leonard]
Rik: Oculus Browser stopped working when using hand inputs. Now hand inputs are on regardless of whether there is a system gesture or not.
19:09:52 [atsushi]
i|https://github.com/immersive-web/webxr-hand-input/issues/117|topic: webxr-hand-input #117 Provide a signal that hands are doing a system gesture|
19:09:57 [atsushi]
rrsagent, publish minutes
19:09:57 [RRSAgent]
I have made the request to generate https://www.w3.org/2022/06/14-immersive-web-minutes.html atsushi
19:10:12 [Leonard]
... wants to provide signal when the system is processing system gestures and the application should not do hand movement recognition
19:10:17 [laford]
+q
19:10:43 [atsushi]
s/scribe /scribe: /
19:10:45 [laford]
c:
19:10:47 [atsushi]
s/+present/present+
19:10:49 [Leonard]
... Concerned about adding another event because of the large number of events
19:10:54 [atsushi]
rrsagent, publish minutes
19:10:54 [RRSAgent]
I have made the request to generate https://www.w3.org/2022/06/14-immersive-web-minutes.html atsushi
19:11:22 [ada]
q+ to say that gamepad is mistake
19:11:33 [ada]
ack laford
19:11:37 [atsushi]
s|s/agenda at/agenda:/||
19:11:53 [atsushi]
s/Agenda at /agenda: /
19:11:55 [atsushi]
rrsagent, publish minutes
19:11:55 [RRSAgent]
I have made the request to generate https://www.w3.org/2022/06/14-immersive-web-minutes.html atsushi
19:11:57 [ada]
q?
19:11:59 [ada]
ack ada
19:11:59 [Zakim]
ada, you wanted to say that gamepad is mistake
19:12:49 [laford]
+q
19:12:51 [cabanier]
q+
19:12:54 [ada]
ack laford
19:13:13 [Leonard]
Lachlan: Would like more clarity on why events are bad
19:13:37 [bajones]
q+
19:13:42 [Leonard]
Rik: There are a lot of authoring issues with events. Mostly it seems that authors are not handling them correctly.
19:13:46 [Leonard]
q+
19:14:36 [ada]
q+ to concede
19:16:58 [ada]
Zakim, choose a victim
19:16:58 [Zakim]
Not knowing who is chairing or who scribed recently, I propose yonet
19:17:09 [cabanier]
flag: XR_HAND_TRACKING_AIM_SYSTEM_GESTURE_BIT_FB
19:17:43 [ada]
zakim, present
19:17:43 [Zakim]
I don't understand 'present', ada
19:17:59 [ada]
who's here?
19:18:02 [ada]
zakim, who's here?
19:18:02 [Zakim]
Present: Leonard, ada, cabanier, present, yonet
19:18:04 [Zakim]
On IRC I see Ren, laford, bajones, Leonard, RRSAgent, Zakim, yonet, atsushi, ada, cwilso, iank_, sangwhan, cabanier, NellWaliczek, bemfmmhhj, SergeyRubanov, babage, etropea73101,
19:18:04 [Zakim]
... [old]freshgumbubbles, rzr, OverTime, Chrysippus, dietrich, fernansd, `join_subline, Manishearth
19:18:20 [ada]
q?
19:18:31 [Leonard]
q-
19:18:33 [ada]
ack cabanier
19:18:36 [ada]
ack bajones
19:18:48 [Leonard]
Leonard has left #immersive-web
19:18:50 [bajones]
Hold on, tech difficulties
19:19:13 [cabanier]
q+
19:19:43 [yonet]
brandon: generally I think it is a little difficult to look at some of the events and make an apples-to-apples comparison
19:20:16 [yonet]
It is something that doesn't come up often enough for people to be motivated. I'm hesitant to make a comparison.
19:20:31 [yonet]
brandon: I want to make sure it is not easy to accidentally ignore
19:21:06 [yonet]
brandon: we do have the target ray mode on input sources, which a lot of apps check
19:22:05 [yonet]
brandon: There are a lot of ifs; with all the content out there, it is hard to say this won't break.
19:22:22 [yonet]
Both a flag and an event would be ignorable by developers
19:23:03 [yonet]
ada: in my experience dealing with hand input, frequently the only time devs check target ray mode is when the app starts.
19:23:31 [laford_]
laford_ has joined #immersive-web
19:23:31 [yonet]
ada: if it is not tracked-pointer, it is gaze and you go down a different path
19:24:07 [yonet]
ada: they would have to significantly refactor to account for target ray mode being able to change at any time
19:24:51 [yonet]
brandon: It would not surprise me if the frameworks are only checking target ray mode once.
19:25:20 [bajones]
q+
19:25:30 [yonet]
ada: From a pure developer point of view, a flag would be simpler.
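A minimal sketch of the pattern being discussed, assuming plain WebXR (TypeScript-style, with WebXR type definitions available); it uses only the existing targetRayMode attribute, re-checked every frame instead of once at startup:

    // Sketch: instead of branching on targetRayMode once at session start,
    // re-check it on every frame, since the set of input sources (and, per
    // this discussion, possibly their behaviour) can change over time.
    function onXRFrame(time: DOMHighResTimeStamp, frame: XRFrame) {
      const session = frame.session;
      for (const source of session.inputSources) {
        switch (source.targetRayMode) {
          case 'tracked-pointer':
            // controller or hand aim ray: ray-based interaction
            break;
          case 'gaze':
            // head gaze: different interaction path
            break;
          case 'screen':
            // touch input on handheld AR
            break;
        }
      }
      session.requestAnimationFrame(onXRFrame);
    }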
19:25:33 [ada]
ack ada
19:25:33 [Zakim]
ada, you wanted to concede
19:25:36 [ada]
ack cabanier
19:25:59 [yonet]
cabanier: I proposed this in the context of hands but maybe we should have a flag for all of the input sources
19:26:22 [yonet]
ada: if the user is doing something for the screen input
19:27:36 [ada]
ack bajones
19:27:38 [ada]
q?
19:27:43 [yonet]
brandon: so many of our devices are gesture input right now. I think the native input can turn into a back gesture. I need to look.
19:28:27 [yonet]
brandon: the thing I want to point out: looking at the properties on the input source now, handedness, target ray mode, the profiles and spaces,
19:28:35 [yonet]
technically those are all immutable
19:28:56 [yonet]
While the space object itself is immutable, you query its pose every frame.
19:29:40 [yonet]
If any of them change, it is a removal and re-add. It is nice to say everything is immutable. That makes me lean towards an event
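For context, the removal/re-add behaviour Brandon describes is what the existing inputsourceschange event surfaces; a rough sketch, with session assumed to be an active XRSession:

    // Sketch: today, a change to an input source's immutable properties is
    // modelled as the old source being removed and a new one added.
    session.addEventListener('inputsourceschange', (event: any) => {
      // event.removed and event.added are arrays of XRInputSource
      for (const source of event.removed) {
        // tear down any state keyed on the old input source
      }
      for (const source of event.added) {
        // read handedness / targetRayMode / profiles once; they stay immutable
      }
    });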
19:29:51 [cabanier]
q+
19:29:55 [ada]
ack cabanier
19:30:02 [yonet]
brandon: maybe it is clearer if it's an event
19:30:21 [yonet]
rik: it would not be an attribute, but a function instead
19:30:37 [yonet]
brandon: I would feel better about that than a straight attribute
19:30:52 [yonet]
rik: you are right, everything else is immutable.
19:31:01 [yonet]
q?
19:31:34 [yonet]
brandon: spitballing on this. Let's say we have a gesture, and ignore whether it is an event or a flag
19:32:01 [yonet]
The target ray space is there, but you might not be able to get it at some point.
19:32:31 [yonet]
brandon: if we stop giving out a target ray pose when there is a gesture, how much would we be breaking things?
19:33:20 [yonet]
if you are using a hand and interacting with buttons, and you take away the target ray space, the ray disappears. Any interactions based on it will disappear naturally
19:34:00 [yonet]
The failure mode that comes to mind is if they have some logic that first checks the target ray and, if it is not there, assumes there is no hand. I haven't seen logic like that.
19:34:16 [yonet]
I don't think we should worry too much about that case
19:34:18 [yonet]
q?
19:34:32 [yonet]
rik: I guess that's something we should try out first
19:34:42 [yonet]
brandon: getPose can always return null
19:35:23 [yonet]
rik: if you can track the controller or hands, we can assume we have a target ray space
19:36:06 [yonet]
brandon: I still think you could do a blur without this. If we think this is the safe thing to do, people might pay more attention.
19:36:50 [yonet]
Maybe we can say we wouldn't be breaking a certain percentage of the apps, but I am not sure if we can say that
19:37:05 [yonet]
Rik: we might be able to gather some data on this.
19:37:46 [yonet]
rik: to recap, we should always have a method that says (tbd). We need to experiment with not having a target ray when you are doing a gesture
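A rough illustration of the experiment Rik recaps, assuming the UA would simply stop producing a pose for the target ray space during a system gesture; getPose() returning null is existing behaviour, and the handler below is only an assumed app-side pattern:

    // Sketch: well-behaved content already has to cope with getPose() returning
    // null (e.g. on tracking loss), so withholding the target ray pose during a
    // system gesture would make ray-based interactions drop out naturally.
    function updatePointer(frame: XRFrame, source: XRInputSource, refSpace: XRReferenceSpace) {
      const rayPose = frame.getPose(source.targetRaySpace, refSpace);
      if (!rayPose) {
        // no target ray this frame: hide the cursor, cancel any hover state
        return;
      }
      // use rayPose.transform to hit-test UI and draw the pointer ray
    }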
19:37:47 [ada]
q?
19:38:40 [yonet]
rik: next topic, a library called doVR?
19:39:01 [yonet]
Right now they use layers, and you can't control their opacity.
19:39:29 [ada]
q?
19:39:29 [bajones]
q+
19:39:45 [yonet]
rik: it is not surfaced in the WebXR Layers spec.
19:39:58 [yonet]
ada: what is the advantage of having opacity?
19:40:43 [yonet]
rik: in the media layer there is no control. If you render it to a WebGL layer, right now you render everything with a shader
19:40:55 [ada]
ack bajones
19:40:56 [yonet]
ada: I guess it is for debugging too.
19:41:57 [yonet]
brandon: There are multiple ways that alpha blending can happen. Is that something that is exposed in OpenXR, and what are the options?
19:42:10 [yonet]
rik: there are multiple options.
19:42:28 [yonet]
rik: composition layer color scale bias
19:42:33 [ada]
q+ to ask about mixBlendMode
19:42:41 [yonet]
is an extension
19:43:05 [yonet]
brandon: this is used not only for alpha but also for a tint
19:43:32 [yonet]
rik: there is also a composition layer alpha blend extension that lets you control how the background is treated.
19:43:50 [yonet]
rik: currently WebXR offers only source-over blending
19:44:06 [ada]
q?
19:44:18 [yonet]
brandon: if I draw partially transparent pixels, it will only render alpha
19:44:21 [ada]
ack ada
19:44:21 [Zakim]
ada, you wanted to ask about mixBlendMode
19:44:30 [ada]
q+
19:44:56 [ada]
ack ada
19:45:07 [yonet]
brandon: I don't have a particular problem with giving alpha. I just don't want to get overly complex with different layer blending options
19:45:25 [ada]
https://developer.mozilla.org/en-US/docs/Web/CSS/mix-blend-mode#syntax
19:45:32 [yonet]
ada: it would be nice if you got the same blend modes as CSS's blend modes
19:45:37 [yonet]
rik: that we can not do.
19:46:12 [yonet]
brandon: a lot of these mix blend modes can be represented by OpenGL and WebGPU
19:46:29 [yonet]
I think it is more likely that we take what WebGL or WebGPU does
19:46:56 [yonet]
you are describing how the values are slotted rather than these human-readable values
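What Brandon calls "how the values are slotted" is the way WebGL expresses blending as an equation plus source/destination factors rather than named modes; a small sketch, with gl assumed to be an existing WebGLRenderingContext:

    // Sketch: blend "modes" the WebGL way, as an equation plus blend factors.
    gl.enable(gl.BLEND);
    gl.blendEquation(gl.FUNC_ADD);

    // classic source-over (non-premultiplied alpha):
    gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);

    // additive blending (e.g. the fog/cloud layer example):
    gl.blendFunc(gl.ONE, gl.ONE);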
19:47:36 [yonet]
ada: not all of these are going to be based on OpenXR. If we can have some kind of blend mode, we could have a unified way for the web
19:47:56 [yonet]
rik: it can be done.
19:48:17 [yonet]
ada: if we have a cloud layer and have fog with additive mode, that would be nice
19:48:20 [ada]
q?
19:48:37 [yonet]
It would be cool to have more of these modes.
19:48:59 [yonet]
ada: WebGL has a list of modes for when it composites
19:49:11 [yonet]
rik: meta is the only one that supports that.
19:49:35 [yonet]
ada: if you were to use it, it would work, but it is not in the WebXR spec
19:49:51 [yonet]
rik: it is not in the WebXR spec but it is in the OpenXR spec
19:50:03 [bajones]
+1
19:50:06 [ada]
+1
19:50:07 [yonet]
rik: everybody OK with opacity?
19:50:07 [cabanier]
+1
19:50:10 [yonet]
+1
19:50:45 [yonet]
ada: go ahead and update the spec
19:51:04 [Ren]
+1
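A sketch of what the change just agreed to might look like from a page's point of view; the opacity attribute and its 0.0 to 1.0 range are assumptions, since it had not yet been added to the Layers spec at the time of this call, and xrGLBinding / refSpace are assumed to already exist:

    // Sketch (hypothetical attribute): fading a quad layer without re-rendering it.
    const quadLayer = xrGLBinding.createQuadLayer({
      space: refSpace,
      viewPixelWidth: 1024,
      viewPixelHeight: 512,
    });
    // assumed: 1.0 fully opaque, 0.0 fully transparent, applied by the compositor
    quadLayer.opacity = 0.5;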
19:51:25 [yonet]
Do we have anything else we want to bring up
19:51:43 [yonet]
Sorry Ren, I've tried to get Ada's attention but I was muted I guess
19:52:13 [ada]
RRSAgent, draft minutes
19:52:13 [RRSAgent]
I have made the request to generate https://www.w3.org/2022/06/14-immersive-web-minutes.html ada
19:52:42 [yonet]
RRSAgent, make minutes
19:52:42 [RRSAgent]
I have made the request to generate https://www.w3.org/2022/06/14-immersive-web-minutes.html yonet
19:53:18 [yonet]
Ada, I think I didn't change the scribe.
19:53:28 [ada]
oh no
19:53:35 [yonet]
Do you think the minutes don't have mine?
19:53:39 [atsushi]
i/flag: XR_HAND_TRACKING_AIM/scribe+ yonet/
19:53:40 [atsushi]
rrsagent, publish minutes
19:53:40 [RRSAgent]
I have made the request to generate https://www.w3.org/2022/06/14-immersive-web-minutes.html atsushi
19:53:44 [yonet]
OK. I'll look into it.
19:53:54 [ada]
thanks for updating that atsushi
19:53:58 [yonet]
and just type it in
19:56:19 [atsushi]
i/next topic, a library called doVR?/topic: layers#283 Alpha property for XRCompositionLayer/
19:56:20 [atsushi]
rrsagent, publish minutes
19:56:20 [RRSAgent]
I have made the request to generate https://www.w3.org/2022/06/14-immersive-web-minutes.html atsushi
19:57:06 [atsushi]
I believe this should be fine ;)
19:58:10 [yonet]
Thanks Atsushi
19:58:22 [yonet]
I've copied the text here if needed.
19:59:37 [atsushi]
quick guide is here -> https://w3c.github.io/scribe2/scribedoc.html
22:27:57 [Zakim]
Zakim has left #immersive-web