19:56:50 RRSAgent has joined #immersive-web
19:56:50 logging to https://www.w3.org/2019/12/03-immersive-web-irc
19:56:56 zakim, clear agenda
19:56:56 agenda cleared
19:57:54 dom has joined #immersive-web
19:58:13 agenda+ webxr-gamepads-module#23 Should hand-based input sources have a Gamepad? requested by Manishearth to be discussed alongside https://github.com/immersive-web/webxr-input-profiles/issues/105
19:58:13 agenda+ webxr-input-profiles#105 Should we have a generic profile for hands? requested by Manishearth to see what people think about a generic hands profile
19:58:13 agenda+ webxr#900 Use permissions API for XR feature requests requested by Manishearth to ask people to look at it
19:58:14 agenda+ webxr#929 Split input sources into primary/auxiliary - klausw
19:59:10 zakim, this conference is Immersive Web WG
19:59:10 got it, atsushi
19:59:19 rrsagent, make log public
19:59:27 chair: Ada, cwilso
19:59:57 agenda: https://github.com/immersive-web/administrivia/blob/master/meetings/wg/2019-12-03-Immersive_Web_Working_Group_Teleconference-agenda.md
20:00:11 Meeting: Immersive Web WG
20:00:48 Previous meeting: https://www.w3.org/2019/10/22-immersive-web-minutes.html
20:00:58 joshmarinacci has joined #immersive-web
20:01:01 rrsagent, draft minutes v2
20:01:01 I have made the request to generate https://www.w3.org/2019/12/03-immersive-web-minutes.html atsushi
20:01:06 Hiya everyone!
20:01:07 present+
20:01:40 good morning!
20:01:58 present+
20:01:59 bajones has joined #Immersive-Web
20:02:40 present+
20:03:02 Scribe: trevorfsmith
20:03:44 klausw has joined #immersive-web
20:03:50 dino has joined #immersive-web
20:04:18 leonard_ has joined #immersive-web
20:04:25 present+
20:04:41 present+
20:04:42 Agenda: https://github.com/immersive-web/administrivia/blob/master/meetings/wg/2019-12-03-Immersive_Web_Working_Group_Teleconference-agenda.md
20:04:46 present+
20:05:12 Present+
20:05:18 alcooper has joined #immersive-web
20:05:28 bialpio has joined #immersive-web
20:06:06 idris_ has joined #immersive-web
20:06:32 https://github.com/immersive-web/webxr-gamepads-module/issues/23
20:06:34 https://github.com/immersive-web/webxr-input-profiles/issues/105
20:06:35 Ada: Welcome! At the bottom of the agenda email there is a list of issues in the pre-CR milestone that are the last blockers before CR. If anyone could take up an issue, that would be a big help toward the goal of CR.
20:06:47 Rafael has joined #immersive-web
20:06:52 zakim, take up agendum 1
20:06:52 agendum 1. "webxr-gamepads-module#23 Should hand-based input sources have a Gamepad? requested by Manishearth to be discussed alongside
20:06:55 ... https://github.com/immersive-web/webxr-input-profiles/issues/105" taken up [from atsushi]
20:07:02 Ada: Starting agenda item about whether hands should be presented with a gamepad.
20:08:00 Artem has joined #immersive-web
20:08:04 Manish: This isn't about exposing articulated hand support; that will be a separate spec with a hand object or similar. The current question: on the HoloLens et al., you can click and grab, and those gestures should trigger select and squeeze events. The question is whether non-physical objects should have a gamepad object with an input mapping.
20:09:06 q+
20:09:10 Manish: One argument is that the gamepad is supposed to be for physical gamepads. Another is that it's nice for one API to expose these very similar events. Hands will definitely trigger grab/squeeze etc., but should they have a gamepad object? Also, should there be a generic hand profile?
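[Editor's note: a minimal sketch of the event-only path Manish describes, assuming a running XRSession; function names are illustrative. The point is that select and squeeze events fire for hands and controllers alike, whether or not a gamepad object is attached to the input source.]

```typescript
// Sketch: handling select/squeeze uniformly, whether the input source is a
// motion controller or a hand. Assumes a WebXR session is already running.
function wireInputEvents(session: XRSession) {
  session.addEventListener('select', (event: XRInputSourceEvent) => {
    // Fires for a controller trigger pull and for a hand pinch alike, so a
    // page that only uses events needs no gamepad-specific code.
    console.log('select from', event.inputSource.handedness);
  });
  session.addEventListener('squeeze', (event: XRInputSourceEvent) => {
    console.log('squeeze from', event.inputSource.handedness);
  });
}
```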
20:09:26 ack bajones
20:10:05 NellWaliczek has joined #immersive-web
20:10:14 present+
20:10:15 present+
20:10:49 alexturn has joined #immersive-web
20:10:50 q?
20:10:52 Brandon: One thing I wanted to point out is that this has ramifications beyond hands. Chrome currently exposes cardboard-style inputs without a gamepad; the input source only fires select events. If we say that hands should have gamepads then it probably makes sense for cardboard as well, despite it being captured by the single event. That's not my preference but I understand why it could make sense. Or we could carve out hands as special.
20:10:53 present+
20:11:00 q+
20:11:03 q+
20:11:09 ack alex
20:12:41 Alexturn: Part of having select and squeeze events is to create a viable path without gamepads. I'm more comfortable with cardboard not having a gamepad because more middleware will handle the non-gamepad case. I was worried that hands would be the only exception and so unsupported, since many pages wouldn't know how to handle inputs without gamepads. I want to avoid hands emulating the gamepad just because pages assume its existence.
20:13:03 q?
20:13:14 q+
20:13:24 Brandon: The other thing the events exist for is user activation. So, for playing back media you're going to want to use those events and not just gamepad state.
20:13:31 ack Artem
20:13:50 q+
20:14:38 Artem: I don't see anything wrong with exposing gamepads on hand objects; however, we're more interested in the full hands API, as Manish mentioned. Oculus is introducing hand tracking for Quest, so without a hands representation it's super confusing: you can do gestures but you don't see your hands. I'm afraid of forgetting about the full hands API because this support (hands with gamepads) exists.
20:14:39 ack NellWaliczek
20:15:04 q+
20:15:56 q+ to talk about semantic vs. physical hand mappings
20:16:34 Nell: The main question I have is about generic input profiles (not specific to Quest, HoloLens, etc.): if we expose the gamepad object, what is the minimum conformance for the gamepad object? Is it trigger, touchpad, etc.? Are folks suggesting a similar namespace for hands? Or are they expecting hands to have input at gamepad.0 or 1? What does conformance for hands' gamepad objects look like? If I can't know from the profile, it's not a good situation.
20:16:49 ack alexturn
20:16:49 alexturn, you wanted to talk about semantic vs. physical hand mappings
20:18:22 Alexturn: I can speak to what OpenXR, and by extension WebXR, does. We're exposing physical hands with select and squeeze actions. You can have a select and squeeze path with names that overlap with controllers. So, for WebXR, where we put select and squeeze in slots 0 and 1, they'd map to select and squeeze on a hand. Conformance to "generic hand" would then be select and squeeze in the same slots on the gamepad. Separately, you could use articulated hand joints and render a hand.
20:18:38 q?
20:18:52 Ravi has joined #Immersive-web
20:18:54 Nell: Are those expected to be analogue data? So gamepad.0 would be interpolated between 0 and 1?
20:19:01 Alexturn: They would be floats.
20:19:13 Nell: Artem, does that align with your expectations?
20:19:25 Artem: I think it will be 0 or 1.
20:19:38 NellWaliczek: The trigger and squeeze are the two you could support?
20:19:57 Artem: One is select and one is "exit from VR", so I'd need to check on other gestures.
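[Editor's note: a hedged sketch of the slot mapping alexturn outlines. In the xr-standard gamepad mapping the trigger/select action lives at buttons[0] and squeeze at buttons[1], reported as floats; whether hands expose a Gamepad at all is exactly the open question of this agendum.]

```typescript
// Sketch of reading the primary actions from the xr-standard slots that
// alexturn describes. Returns null for gamepad-less sources, which remains
// a legal shape for an input source.
function readPrimaryActions(source: XRInputSource) {
  const gamepad = source.gamepad; // null when no Gamepad is exposed
  if (!gamepad || gamepad.mapping !== 'xr-standard') return null;
  return {
    select: gamepad.buttons[0]?.value ?? 0,  // e.g. pinch strength on a hand
    squeeze: gamepad.buttons[1]?.value ?? 0, // grab strength, if supported
  };
}
```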
20:20:20 NellWaliczek: So it may be that a generic hand can't assume the squeeze; what does that imply? Different profiles?
20:20:52 alexturn: I would imagine grab-and-move would be more common, so it would be good to know beforehand whether platforms will support it.
20:21:33 NellWaliczek: So maybe we need several generic profiles for hands. Could you put comments in the relevant issue so we can get spec text and the registry updated? Artem, could you make a PR with your changes?
20:21:46 ack Manishearth
20:21:51 https://github.com/immersive-web/proposals/issues/48
20:21:55 q-
20:22:04 Manishearth: I wanted to mention that there's a proposals issue for articulated hands. If you're interested, please comment there.
20:22:09 q-
20:22:09 hi, Ravi from Magic Leap; we would like to get back on GitHub with respect to generic events for hand gestures
20:22:11 (linked above)
20:22:21 Thanks, Ravi!
20:22:56 Alexturn: Artem, please follow up with the Oculus hand tracking folks to find out whether squeeze will be supported.
20:23:17 q+
20:23:20 Brandon: I thought airtap and bloom are the two HoloLens gestures.
20:23:29 ack NellWaliczek
20:23:33 alexturn: We'll support WebXR, so additional gestures are possible.
20:23:48 NellWaliczek: Are we aware of devices with hand support but no controller support?
20:23:59 alexturn: It could be true for HoloLens.
20:24:11 NellWaliczek: I wanted to know whether there would be no controller fallback.
20:24:27 alexturn: Yes, on HoloLens.
20:24:42 https://github.com/immersive-web/webxr/pull/900
20:25:00 zakim, take up agendum 3
20:25:00 agendum 3. "webxr#900 Use permissions API for XR feature requests requested by Manishearth to ask people to look at it" taken up [from atsushi]
20:25:01 Next agenda item: using the Permissions API for XR feature requests
20:25:36 Manish: This PR uses the permissions API underneath, so it simplifies logic and lets us use the features the platform exposes.
20:25:40 q+
20:25:47 ack NellWaliczek
20:26:06 q+
20:26:12 NellWaliczek: A quick question for Mounir: what does "running out of time" mean in your review comments?
20:26:22 mounir: I was just saying that I had to leave that day.
20:26:45 ack bajones
20:26:47 NellWaliczek: Ok, great that it's not more existential.
20:27:54 Brandon: Currently we return the granted features, and I know that's something we've asked about in the past. What is the use case for that? Is anyone on the call looking for a way to query for that?
20:28:02 q+
20:28:18 Manish: We had some issues with concrete uses. I think #722.
20:28:25 ack NellWaliczek
20:28:48 NellWaliczek: You want to know whether there are developer use cases for querying a current session for its granted capabilities?
20:29:03 Brandon: Yes: if I ask for this in an upcoming request, am I going to get a prompt?
20:29:45 NellWaliczek: A Sumerian customer may want to put up a more contextually relevant message about the feature that was declined, but I don't think we can check whether it's declined or just not supported.
20:30:12 NellWaliczek: I need a way to detect it so I can tell the user that it's not available.
20:31:10 Brandon: Looking at current features (reference spaces), there's an alternate way to determine whether a feature is supported. It's not clear whether that will be true for future features. It would be potentially useful to differentiate between denied and not-available.
20:31:46 NellWaliczek: It's possible that future APIs will return different values for refusal and never-available, so our customers will want to know that.
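[Editor's note: a sketch of the request-then-probe pattern Brandon and Nell discuss, assuming the current requestSession shape; 'bounded-floor' stands in for any optional feature. Per Manish, the PR routes the feature requests through the Permissions API underneath without changing this shape. The catch branch illustrates Nell's point that denial and non-support are indistinguishable today.]

```typescript
// Sketch: request an optional feature, then probe for it after session
// creation via requestReferenceSpace, the "alternate way" Brandon mentions.
async function startSession(): Promise<XRSession> {
  const xr = navigator.xr;
  if (!xr) throw new Error('WebXR not available');
  const session = await xr.requestSession('immersive-vr', {
    requiredFeatures: ['local-floor'],
    optionalFeatures: ['bounded-floor'],
  });
  try {
    await session.requestReferenceSpace('bounded-floor');
    // Feature was granted and is available on this device.
  } catch {
    // Declined or unsupported: the page cannot currently tell which.
  }
  return session;
}
```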
20:31:47 q+
20:31:53 Brandon: Thank you, that adds context.
20:31:57 ack mounir
20:32:40 mounir: We are exposing granted features. If a feature is denied we won't tell you which one was denied. I wonder if that will be solved by repeated requests to determine which ones are denied.
20:33:35 mounir: It's not clear that browsers would make that granularity available to users. It makes it feel as if the feature comes from the device, which isn't the case. Asking for permission status doesn't help at the moment; I expect every browser to return everything.
20:34:05 NellWaliczek: Could you clarify? I understand asking for each feature to determine whether it's available.
20:34:36 mounir: The permission API won't expose whether the device exposes the feature. It exposes whether the UA allows the feature.
20:35:25 q+ to talk about eye tracking as such a feature
20:35:46 NellWaliczek: I don't care whether it was declined or not possible. With the features we have today you can request them and then check if they are available. Perhaps we need to put guidelines in the spec that required or optional features must have a way to query whether they are available.
20:36:23 mounir: Yes, that's important for features. We should not have requestSession shaped differently just for that. That should be a separate API from permissions.
20:36:36 NellWaliczek: Let's be sure that guidance is in the spec text for future features.
20:36:46 mounir: Sure, that is a general web principle.
20:36:52 ack alexturn
20:36:52 alexturn, you wanted to talk about eye tracking as such a feature
20:37:42 alexturn: I wanted to mention eye tracking as a future feature with above-average privacy impact. You may want to already be in the headset and then enable eye tracking later. If the app wants to enable it, it needs to explain beforehand why it will request the permission.
20:38:17 alexturn: So, apps will want to know whether it has been / will be refused, or whether it's not ever possible.
20:38:42 NellWaliczek: We didn't spec how the permission prompt would be displayed, so it could be in the HMD or on a separate screen, for example.
20:39:02 alexturn: In addition to upgradability, there would be this additional query to know how to manage the request.
20:39:13 https://github.com/immersive-web/webxr/pull/929
20:39:14 zakim, take up agendum 4
20:39:14 agendum 4. "webxr#929 Split input sources into primary/auxiliary - klausw" taken up [from atsushi]
20:39:40 q+
20:40:37 klausw: This may be a short conversation. The way the spec is written, it says an input source needs a primary action, which seems over-prescribed. Vive trackers may be input-less auxiliaries. A separate topic is touches on an input device: they may or may not count as primary inputs, and may or may not generate mouse events.
20:40:45 ack NellWaliczek
20:40:51 klausw: So, the spec is currently more specific than it needed to be.
20:41:21 q+
20:42:20 NellWaliczek: The background is that the things noted are intentional, but perhaps not right. When we looked at it we excluded things like the Vive trackers because we didn't want an over-broad category. So, Vive trackers could be a space, not an input source. Given that we don't do that yet, what does it mean for other sorts of tracked spaces and consistency for how they're reported? A future tracker, like an image tracker, would fall into a different category.
20:43:00 NellWaliczek: So, we wanted to exclude things without a primary input source. We received feedback that things like screen touches should have primary and secondary inputs.
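[Editor's note: a sketch of how the categories in this discussion surface in the current API; the branches are illustrative, not a proposed design. The primary/auxiliary split in #929 would relax the assumption, baked into the spec today, that every input source carries a primary action.]

```typescript
// Sketch: the kinds of input sources under discussion, as a page sees them.
function describeInputSources(session: XRSession) {
  for (const source of session.inputSources) {
    switch (source.targetRayMode) {
      case 'gaze':
        // Cardboard-style: fires select events but exposes no gamepad today.
        break;
      case 'screen':
        // Transient touch input; whether it is "primary" is part of the debate.
        break;
      case 'tracked-pointer':
        // Motion controllers, and potentially hands.
        break;
    }
    if (!source.gamepad) {
      // Already legal: a source may omit the Gamepad and rely on events alone,
      // which is why a hypothetical buttonless tracker needs a defined category.
    }
  }
}
```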
20:43:16 NellWaliczek: I'll review this, thanks.
20:43:18 ack bajones
20:44:13 q+ to ask about the profile id
20:44:41 bajones: I remember Nell's recounting of the history the same way. That may not have been the right decision; we have more context around the decision now for things like finger tracking and screen-based sessions. The Vive tracker is useful but a weird case. The trackers are used as alternative inputs, either by strapping them to joints for skeleton tracking, or by using something like an AR-15-shaped controller with a tracking puck on top.
20:45:51 ack NellWaliczek
20:45:51 NellWaliczek, you wanted to ask about the profile id
20:45:53 bajones: There's really no way to differentiate between the two uses (skeleton / controller), as Vive assumes a bespoke app using the trackers that makes its own assumptions. Looking at touchscreens maybe does point toward separating it out. Maybe if the device has at least a primary input, then it can be an input device.
20:46:03 am i frozen?
20:46:37 NellWaliczek: The one question about Vive trackers would be what the profile ID would be. We don't have a profile ID for a generic "no data given". What do people think it should report?
20:46:52 bajones: The Vive puck is capable of reporting All The Buttons or none, and we can't know which.
20:47:01 q?
20:47:14 NellWaliczek: If we report the pucks we need a sense of conformance, but I don't know what that would be.
20:47:32 q+ To mention that one use case could be for accessibility
20:48:08 klausw: I don't have an urgent use case for pucks. This was motivated by the finger side of things. Having alternate channels other than input sources would work, but we don't have that now. Maybe we don't need to solve it now, but I wanted to check whether we purposefully made it so specific.
20:48:19 q+
20:48:29 ack kip
20:48:29 kip, you wanted to mention that one use case could be for accessibility
20:49:18 kip: A quick side note: two use cases are accessibility, for input by alternate means like puffing (blowing into a tube), and something without gestures that still allows visualizing an object in space so the user can safely stand in the space.
20:49:25 ack bajones
20:49:38 bajones: That use case is awesome. 🎉
20:50:06 Ravi has joined #Immersive-web
20:50:22 q+ to ask what openxr is using for profile ids?
20:50:32 bajones: In terms of trackable objects that aren't inputs, most of the native APIs that have them just stuff them into the input APIs anyway. A Vive puck would show up in the OpenXR input system. If we want to make a distinction, I suspect we'll be fighting the native systems a bit.
20:50:37 ack NellWaliczek
20:50:37 NellWaliczek, you wanted to ask what openxr is using for profile ids?
20:50:57 NellWaliczek: There is a question about OpenXR that maybe Alex could answer: what are the pucks reported as in OpenXR?
20:51:15 q?
20:51:15 Alex: That hasn't been solved.
20:51:49 NellWaliczek: Maybe we shouldn't address it yet either. Klaus, maybe the spec shouldn't address a puck (or a device without a primary action) so we're not blocking the changes in flight.
20:52:07 bajones: We can spec out multi-touch without endorsing additional tracked objects one way or another.
20:52:49 Ada: I think we're near the end of the call. One last-minute news item: please look at the help-wanted issues listed in the email, particularly in the pre-CR milestone.
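[Editor's note: a sketch of why the profile ID question matters, assuming generic registry names such as 'generic-trigger' from webxr-input-profiles. Pages walk XRInputSource.profiles from most specific to most generic and use the first ID they recognize, so a buttonless puck has no obvious generic fallback to terminate that list with.]

```typescript
// Sketch: the lookup pattern the input-profiles registry is built around.
// The profile names here are illustrative stand-ins for a page's known set.
const knownProfiles = new Set(['generic-trigger', 'generic-trigger-squeeze']);

function pickProfile(source: XRInputSource): string | undefined {
  // profiles is ordered from most specific to most generic.
  return source.profiles.find((id) => knownProfiles.has(id));
}
```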
20:53:54 Ada: The other thing I wanted to note is that in February we're thinking about another face-to-face, probably in Seattle and probably on Feb 12th & 13th. I'll send out more information when we have a venue. If there is a Seattle-area person who could volunteer a venue, please contact cwilso or me.
20:54:01 Ada: Any other topics?
20:54:31 Meeting complete!
20:55:07 rrsagent, draft minutes v2
20:55:07 I have made the request to generate https://www.w3.org/2019/12/03-immersive-web-minutes.html atsushi
21:57:51 ravi has joined #immersive-web
22:01:49 Is this the issue w/ generic hand tracking inputs? https://github.com/immersive-web/webxr-input-profiles/issues/105
23:13:15 atsushi has joined #immersive-web
23:18:05 Manishearth has joined #immersive-web
23:18:09 Manishearth has left #immersive-web
23:27:40 Manishearth has joined #immersive-web
23:35:24 Manishearth has joined #immersive-web
23:58:48 bialpio has joined #immersive-web