18:00:21 RRSAgent has joined #immersive-web
18:00:21 logging to https://www.w3.org/2020/06/16-immersive-web-irc
18:01:01 Zakim has joined #immersive-web
18:18:27 present+
18:19:06 chair: cwilson
18:20:30 Agenda: https://github.com/immersive-web/administrivia/blob/master/meetings/cg/2020-06-23-Immersive_Web_Community_Group_Teleconference-agenda.md
18:21:33 Date: 16 June 2020
18:26:31 Zakim, this is immersive-web
18:26:31 got it, yonet
18:53:53 meeting: Immersive Web Community Group
18:54:00 rrsagent, make log public
18:54:09 rrsagent, publish minutes v2
18:54:09 I have made the request to generate https://www.w3.org/2020/06/16-immersive-web-minutes.html atsushi
18:55:49 previous meeting: https://www.w3.org/2020/05-19-immersive-web-minutes.html
18:57:22 zakim, list agenda
18:57:22 I see nothing on the agenda
18:57:47 agenda+ layers#154 first pass at hit testing; discuss the current hit test proposal [cabanier]
18:58:14 agenda+ layers#158 Layers on devices with views of different resolutions; how should we handle views from different devices (ie eye displays + observer) [cabanier]
18:58:32 Leonard has joined #immersive-web
18:58:36 agenda+ layers#163 Should non-projection layers support more than 2 views?; Should non-projection layers support more than 2 views? [cabanier]
18:58:38 zakim, list agenda
18:58:38 I see 3 items remaining on the agenda:
18:58:39 1. layers#154 first pass at hit testing; discuss the current hit test proposal [from cabanier via atsushi]
18:58:39 2. layers#158 Layers on devices with views of different resolutions; how should we handle views from different devices (ie eye displays + observer) [from cabanier via atsushi]
18:58:39 3. layers#163 Should non-projection layers support more than 2 views?; Should non-projection layers support more than 2 views?
[from cabanier via atsushi]
18:59:16 kip has joined #immersive-web
18:59:26 present+
19:00:05 bajones has joined #Immersive-Web
19:00:27 present+
19:01:55 present+
19:02:15 take up agendum 3
19:02:27 cabanier_ has joined #immersive-web
19:03:22 zakim, choose a victim
19:03:22 Not knowing who is chairing or who scribed recently, I propose cwilso
19:03:29 zakim, choose a victim
19:03:29 Not knowing who is chairing or who scribed recently, I propose Leonard
19:03:39 present+
19:03:43 zakim, choose a victim
19:03:43 Not knowing who is chairing or who scribed recently, I propose Leonard
19:03:45 present+
19:03:49 zakim, choose a victim
19:03:49 Not knowing who is chairing or who scribed recently, I propose cwilso
19:03:55 zakim, who is here?
19:03:55 Present: yonet, Leonard, kip, cwilso, cabanier_, bajones
19:03:57 On IRC I see cabanier_, bajones, kip, Leonard, Zakim, RRSAgent, yonet, Karen, atsushi, sangwhan_, trevorfsmith, sheppy, surma, Manishearth, mounir, ada, flaki, bertf, garykac,
19:03:57 ... iank_, NellWaliczek, cwilso
19:04:04 zakim, choose a victim
19:04:04 Not knowing who is chairing or who scribed recently, I propose kip
19:04:11 present+
19:04:13 present+
19:04:17 present+
19:04:36 scribenick: manishearth
19:04:54 Zakim, take up agendum 1
19:04:54 agendum 1. "layers#154 first pass at hit testing; discuss the current hit test proposal" taken up [from cabanier via atsushi]
19:05:09 https://github.com/immersive-web/layers/pull/154
19:05:09 https://github.com/immersive-web/layers/issues/163
19:05:27 q?
19:05:28 cabanier: I've created a strawman proposal for hit testing in layers
19:05:30 q+
19:05:44 q+
19:05:46 cabanier: which describes what layer was hit, and where on the layer it was hit, along with where in 3d space it was hit
19:05:49 q-
19:05:50 q+
19:06:06 cabanier: also extended XRFrame so that you can pass a space+array and it returns a list of those dictionaries
19:06:35 cabanier: since everything is known there it can be synchronous
19:06:58 bialpio has joined #immersive-web
19:07:13 alexturn has joined #immersive-web
19:07:14 ack bajones
19:07:17 cabanier: manish suggested an event based api but the issue is that the event may fire too late
19:07:50 bajones: we should note that we have skipped a bit ahead in the agenda. rik is discussing the second agenda item
19:08:05 RafaelCintron has joined #immersive-web
19:08:06 bajones: for hit testing itself it's tempting to find parallels with world geometry
19:08:07 [Just saw that Zakim chose me while reconnecting...
19:08:10 Will scribe...
19:08:14 ... but different in that the client side knows all the info in this case
19:08:22 brandon: You can probably successfully offload this to a library
19:08:31 zakim, take up agendum2
19:08:31 agendum 2. "layers#158 Layers on devices with views of different resolutions; how should we handle views from different devices (ie eye displays + observer)" taken up [from
19:08:34 ... cabanier via atsushi]
19:08:36 zakim, take up agendum 3
19:08:36 agendum 3. "layers#163 Should non-projection layers support more than 2 views?; Should non-projection layers support more than 2 views?" taken up [from cabanier via atsushi]
19:08:40 zakim, take up agendum 1
19:08:40 agendum 1. "layers#154 first pass at hit testing; discuss the current hit test proposal" taken up [from cabanier via atsushi]
19:08:49 ...
because as long as we do a decent job of defining what shape the parameters are in and what it gives out, the math behind this is a bit of a pain, but is not anything that can't be solved in JS
19:09:00 ... Not certain that is the right way to go, off to a library, but is a valid option for this case
19:09:00 bajones: the math for this is probably something that you can maybe offload to JS. but it might be a perfectly valid option
19:09:14 bajones: by the same token we don't need a big long subscription event based thing when we're talking about this
19:09:23 bajones: because everything can be figured out on the client side
19:09:36 ... so not entirely sure if the event is necessary here
19:09:47 Thanks for scribing Manishearth, I'll take over during your talk
19:09:49 ... unless there was some kind of DOM input on the layer and there was a security reason
19:10:11 bajones: final thought. i see a formulation of "find hit test", which will loop through all layers
19:10:39 ... i wonder if given the way you could use this, and given that everything is client side, we should make this a property of the layer itself, "for this frame, this space/etc, where did i hit?"
19:10:50 ... then we don't need to worry about sequences etc
19:10:51 q+
19:11:12 bajones: not aware of any native API equivalents to this that might dictate this particular design
19:11:17 ... something i'm missing here, rik?
19:11:26 cabanier: not that i think so. this won't be done in the native api
19:11:33 ack manish
19:11:39 manishearth: Clarifications
19:11:47 ... Not suggesting a subscription events based model
19:11:54 ... Have preferred pointing ray field on XRInputSources
19:12:00 ... Used to trigger events, similar to select events
19:12:03 ... Come with their own frame
19:12:07 ... Can do most operations you want to
19:12:19 ... Idea is to have a preferred pointing ray on XRInputSource anyways
19:12:27 ... For example, application is doing offset stuff
19:12:34 ...
Not using target ray space, but using offset
19:12:38 ... Useful for DOM Overlay
19:12:45 ... Click events based on offset ray rather than target ray space
19:12:57 ... WebXR spec does not specify pointing ray
19:13:02 ... Having field is useful. Can have event api
19:13:07 ... Subscription based model seems heavy
19:13:16 ... For utility, per frame thing, here's space and ray...
19:13:23 ... Give me intersection in this space. That would work
19:13:34 ... Event thing would be due to needing it for DOM Overlay anyways
19:13:38 ... Unifying would be nice
19:13:44 ack cab
19:13:48 Rik: Makes a lot of sense to have hit test on layer
19:13:52 ... Could be multiple layers
19:14:06 q+
19:14:07 ... Equirect layer wouldn't want to hit. Just quads and cylinders
19:14:12 ... Could be fine with that change
19:14:21 ... Originally proposed in library. At the time people objected
19:14:21 q+
19:14:26 ... Library might be the way to go
19:14:29 q+
19:14:32 ... Didn't want to revisit
19:14:36 present+
19:14:42 ack bajones
19:14:43 ... Could be that some UAs might place the layers slightly differently. Would all fall down
19:14:51 bajones: Couple of notes based on what was said
19:15:02 ... First, Rik saying that different UAs place layers differently
19:15:11 ... Would be concerned if the case, could happen, but ideally shouldn't
19:15:23 ... Good way to get bugs. Might see seams of black edges
19:15:47 ... In this case (Manish's), there may be intersection point other than pointing ray
19:15:50 ... Use for basic physics
19:16:06 ... Someone could do multiuser environment. Raytrace their pointer to surface, from network
19:16:14 ... detaching from input sources is decent idea
19:16:23 ... To Rik's idea on library. Given that it can be a library.
19:16:28 ... Layering module already complicated
19:16:46 ... Would not object personally. Good idea, but maybe for backburner. See what kind of feedback once layers out in the open
19:16:57 q+
19:17:00 ...
Take time to build up intersection library ourselves as we need to anyways. See what works for people
19:17:05 ... Additive layer on,
19:17:12 ... Will have better idea later
19:17:14 ack next
19:17:20 Manish: Want to mention that would prefer a library
19:17:23 ack manish
19:17:36 ... Pushing for discussion to happen as DOM overlay gives event. Screen space... Multiple things giving X,Y coordinates
19:17:43 ... Realized that this may be potential option
19:17:53 ack alex
19:17:55 ... Potential of unification interests me. If not doing that, then library is fine
19:18:04 alex: Chime in for library
19:18:17 ... Something inside the UA feels like opportunity to address subtle differences
19:18:27 ... Apps will generally need to raycast against layers and scene geometry
19:18:32 ... Will need to do both predictably
19:18:41 ... Expect them to raycast against scene representation
19:18:57 ack cab
19:18:58 ... Just happen to use layers to render part of the scene, but would need to do that in their engine and have predictable outcome
19:19:08 Rik: Getting started on polyfill for layers
19:19:22 ... One of first things would be to make hit testing part of that polyfill
19:19:27 ... If others interested, can work with them
19:19:31 ... Otherwise will work on this
19:19:46 zakim, close agendum 1
19:19:46 agendum 1, layers#154 first pass at hit testing; discuss the current hit test proposal, closed
19:19:48 I see 2 items remaining on the agenda; the next one is
19:19:48 2. layers#158 Layers on devices with views of different resolutions; how should we handle views from different devices (ie eye displays + observer) [from cabanier via atsushi]
19:19:56 cwilso: Next item
19:20:04 ... Cover 163, Rik?
19:20:20 Rik: Don't have solution to this one
19:20:21 zakim, take up agendum 2
19:20:21 agendum 2. "layers#158 Layers on devices with views of different resolutions; how should we handle views from different devices (ie eye displays + observer)" taken up [from
19:20:24 ...
cabanier via atsushi]
19:20:31 ... Now if create texture array, we assume every view has the same resolution
19:20:34 ... Not the case always
19:20:41 ... Camera resolution may be different than the displays
19:20:45 ... Don't know now how to solve for the spec
19:20:47 q+
19:20:57 ... Should we throw or make things more complicated and allow texture arrays and separate textures
19:20:57 q+
19:21:02 ... Sounds confusing
19:21:07 ... Would like to hear from alex
19:21:11 ack manish
19:21:20 manish: Also, should let alex go first...
19:21:34 ... Unfortunate result suggested throwing for texture arrays
19:21:45 ... Content supporting multiple views may not realize that using texture arrays is broken here
19:21:59 ... May be nice to request "give me texture arrays otherwise give me normal views"
19:22:11 ... This may require api change that can be backed by multiple texture arrays
19:22:17 q+
19:22:18 ... When getting a view you get both a texture array and index
19:22:24 ... Not sure which approach is better or necessary
19:22:32 ... Ideally content written for two views will not automatically break
19:22:34 ack alex
19:22:48 Alex: I would hope to get to, with others chiming in...
19:22:55 ... Speaking to how HoloLens does it
19:23:09 ... Optimal path to hit with layers spec -- use texture array for main stereo view. Use mono for separate things
19:23:22 ... Not third element for first person view, separate from texture array
19:23:26 ... Way the hardware works
19:23:48 ... Previous comment Manish made. Having way for app to get texture array, but can offer something else. Reduce axes of ambiguity
19:24:14 ... If know UA is unable to support... OpenXR-backed runtimes support texture arrays, but some may be faster than others
19:24:25 ... Is required to support apps that use the paths,
19:24:29 q+
19:24:35 ... Should WebXR require both texture array path and non-texture array path
19:24:45 ack baj
19:24:49 ...
Once you have promise that texture arrays will work
19:25:04 bajones: I think that for us we are not worried about any UA that does not support texture arrays
19:25:08 ... Should be baseline at this point
19:25:22 ... Reason for the non-texture-array path at all is to support WebGL 1 based apps
19:25:32 ... If didn't care about WebGL 1 would say "texture arrays all the way"
19:25:52 ... Scenario to consider is not just HoloLens observer view, but also ... [missed] have four views. Higher inset views
19:25:57 q-
19:26:01 ... People working on [Varjo?]
19:26:14 ... In both cases, said that all of the primary views that you allocate are same resolution
19:26:31 ... In Varjo, smaller view is higher density but same size. Can be allocated in one go
19:26:38 q+
19:26:41 q+
19:26:42 ... HMD vendors seem to be aware of this
19:26:53 ... Encounter a lot of that, but maybe can't count on it, such as with HoloLens observer
19:27:03 ... Manish was wondering if we could return another texture array. Answer is yes
19:27:15 ... API as it is now, here's view and layer. Give me WebGL sub-image
19:27:21 ... Could contain texture array or something else
19:27:41 ... Would strongly recommend that if developer uses texture arrays, that we always support texture arrays. Or if textures, always support that
19:27:48 ... Reduce complexity
19:27:59 ... Maybe have primary views in texture array
19:28:18 ... This will break down if developers are trying to do multiview rendering. Trying to render all views in one batch
19:28:23 ... All but one would be the same texture
19:28:39 ... Some other conversations around those observer views have a lot of discussion about people opting into that
19:28:51 ... Make contract with the UA that says, "I am handling these if you give them to me"
19:28:58 ... If that's the case and we're willing to be more explicit
19:29:03 q+
19:29:23 ...
Clearly demarcate what views are non-primary, then we could get into a comfortable situation where we make a guarantee that primary views are part of the same texture array
19:29:29 ... Maybe we allow multiple viewports per level
19:29:34 ... There's some more discussion there
19:29:43 ... Do allow for these secondary views to be formatted a bit differently
19:29:54 ... Need to be aware of that when develop it. Make sure app handles properly and test
19:30:04 ack alex
19:30:13 alex: Comment. Great point about Varjo
19:30:17 ... Two notions of WebXR
19:30:21 ... Primary view configs
19:30:24 OpenXR Varjo primary quad views: https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#XR_VARJO_quad_views
19:30:27 ... Mono view device or stereo view device
19:30:39 ... Extension enables new primary, four views
19:30:44 ... Independently notion of secondary views
19:30:45 OpenXR secondary views: https://www.khronos.org/registry/OpenXR/specs/1.0/html/xrspec.html#XR_MSFT_secondary_view_configuration
19:31:02 ... Secondary view system that proposing is reviving some things from spec that was excised for 1.0
19:31:06 ... Bringing back as an extension
19:31:18 ... Introduces more machinery for secondary views
19:31:23 ... May be a simpler opt-in
19:31:30 ... Give me 4 primary views in texture arrays
19:31:42 ... Option to provide primary and secondary array with different resolution and render paths
19:31:49 ... Can't just be another element in the texture array
19:31:52 q?
19:31:59 ... Okay to be a separate context. This is where we are landing in OpenXR
19:32:07 ack cab
19:32:11 Rik: Sounds like we are all getting agreement that we should not throw
19:32:17 ... And for primary views we should create texture array
19:32:24 ... Secondary views should be texture array but separate
19:32:28 ... Page should opt in
19:32:33 ... for secondary views
19:32:38 ... Should page opt in for secondary views?
19:32:41 ... Was pushback
19:32:46 ...
Now we are starting to differentiate
19:32:51 ... With layers spec
19:32:58 ... In order to do that we need to define that somewhere
19:33:03 ... Should be defined in WebXR spec
19:33:13 manish: This dovetails into...
19:33:17 ack manishearth
19:33:26 ... My PR on spec ties into what OpenXR calls secondary views
19:33:35 ... Brandon and I are concerned about primary and secondary language
19:33:39 q+
19:33:59 ... Additional views concept. In layer spec can say "here's an additional thing you should handle"
19:34:07 ... We could have a line like this here as well
19:34:25 ... Not all cases where you have two views, eg in a C.A.V.E. system / first person system, device can reproject
19:34:30 ... Give you that view without surfacing to JS
19:34:39 ... In C.A.V.E. has to give all views. can't extrapolate
19:34:54 ... Similar situation where want to make distinction between multiple primary views and additional views that can be ignored
19:34:54 q+
19:34:59 ack cab
19:35:03 ... Concept of additional views handles this
19:35:17 Rik: Under impression that additional views means you have more than two views
19:35:27 ... In C.A.V.E. can have 4. Doesn't seem to match up.
19:35:38 ... Primary views are what observer sees.
19:35:46 manish: Way I tried to spec lets that distinction exist
19:36:00 ... Not clear but the way it is specced, additional views means "not your normal views". Can change that
19:36:10 q+
19:36:13 q-
19:36:13 ... to "primary / secondary"
19:36:13 Rik: Changes make sense
19:36:28 ack alexturn
19:36:29 manish: Concept invented for this may not be useful for layers. Can change that concept so it works
19:36:32 q?
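[The emerging consensus above -- primary views share one texture array at a single resolution, while opted-in secondary views (e.g. an observer camera) get separate allocations -- can be sketched as a small helper. This is an illustrative sketch only: the `primary` flag, the field names, and the function itself are assumptions drawn from the discussion, not spec'd API.]

```javascript
// Hypothetical allocation planner a UA or polyfill might use under the
// rule discussed above. Not part of the WebXR Layers spec.
function planLayerAllocations(views, secondaryViewsOptIn) {
  const primary = views.filter(v => v.primary);
  // Secondary views are only surfaced if the page opted in to handle them.
  const secondary = secondaryViewsOptIn ? views.filter(v => !v.primary) : [];

  if (primary.length === 0) throw new Error("at least one primary view expected");

  // Guarantee under discussion: all primary views share a resolution,
  // so one texture array with one element per view suffices.
  const [first, ...rest] = primary;
  if (rest.some(v => v.width !== first.width || v.height !== first.height)) {
    throw new Error("primary views are expected to share a resolution");
  }

  return {
    primaryArray: { width: first.width, height: first.height, layers: primary.length },
    // Secondary views may differ in resolution (e.g. HoloLens observer),
    // so each gets its own allocation rather than another array element.
    secondaryAllocations: secondary.map(v => ({ width: v.width, height: v.height })),
  };
}
```

[The key design point it illustrates is the one Rik summarized: don't throw, put primary views in one array, and keep secondary views separate behind an opt-in.]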
19:36:44 ack bajones
19:37:00 bajones: Say really quickly that I do think distinction between primary and secondary views is more necessary given this conversation
19:37:03 manish: Can do that
19:37:10 q+
19:37:12 Rik: Can change layers spec so is not so confusing
19:37:16 ack manish
19:37:35 q+
19:37:39 manish: Quick question - I am defining primary views as things that you must render to. Do we expect a case where primary views have different resolutions?
19:37:47 ... Maybe in a C.A.V.E. system? Might be other systems?
19:37:54 ... Kicking a can for cave systems down the road here?
19:38:10 bajones: Not aware of any systems where primary views are of a different resolution, even if not a different size
19:38:19 ... Even in cave system, is a cube. Sides all the same size
19:38:28 ... Don't see where it would need to be differing
19:38:33 ... OpenXR is ideally where things are trending
19:38:44 ... In that scenario, implementer gets to choose resolution that is passed down
19:38:56 ... We can probably just decide that they are same resolution, but underrezzed
19:39:01 ... Compositor will figure that out
19:39:09 ... Not overtly concerned, but didn't give thought
19:39:10 ack alexturn
19:39:23 alexturn: Haven't explored on HoloLens 3 element texture array
19:39:27 ... Haven't dug deep there
19:39:36 ... Reasonable to do extra pass. Not quite doing multiview
19:39:43 ... I think would be interesting to explore
19:39:50 ... Want to prove out that we can get to reasonable performance
19:39:58 ... To see if single texture for all of them
19:40:06 ... Other side for Varjo and StarVR.
19:40:17 ... Wide-left and wide-right...
19:40:21 ... Nuance for additional discussion
19:40:35 ... Meaningful for runtime to know in advance. If you opt in
19:40:51 ... If not opt in for StarVR, stereo covers central display and some of wide displays
19:40:56 ... If didn't opt in, get the same for all displays
19:41:05 ...
Core 0,1 views L,R inner displays are not exactly that mapping
19:41:14 ... Need to know as a runtime/UA which FOV to give you
19:41:30 ... Keep in mind that it is useful / valuable for runtime to know if opt-in to operate the primary views
19:41:38 Rik: Move to next item?
19:41:39 next agendum
19:41:40 Rik: Yes
19:41:47 move to agendum 2
19:42:04 Rik: Currently only 2 views
19:42:08 ... One for left and right
19:42:14 ... getSubImage for one image
19:42:20 ... Working on supporting more views
19:42:23 ... Updating layers spec
19:42:28 ... Incorrect assumption
19:42:39 ... Associating with view for stereo is incorrect assumption
19:42:51 ... Stereo means left or right, not anything to do with the view
19:42:53 rrsagent, please publish minutes v2
19:42:53 I have made the request to generate https://www.w3.org/2020/06/16-immersive-web-minutes.html atsushi
19:43:01 ... Created a PR
19:43:02 https://github.com/immersive-web/layers/pull/165
19:43:07 ... To fix this
19:43:17 ... This change makes it so views are only applicable to projection layers
19:43:25 ... And you have to pass an XREye to getSubImage
19:43:37 ... Depending on if stereo or mono, pass in L+R
19:43:42 ... Manish commented on PR
19:43:45 ... Big change
19:43:52 ... Make sure everyone is okay with this change
19:43:55 i/Rik: Currently only 2 views/topic: layers#163 Should non-projection layers support more than 2 views?; Should non-projection layers support more than 2 views? [from cabanier]/
19:43:56 ... Let me know if you have concerns
19:44:00 q+
19:44:13 bajones: I haven't reviewed the PR yet
19:44:16 ... Will do that after call
19:44:17 ack brajones
19:44:29 ... Worth calling out as sanity check assumption in 163
19:44:43 ... Where saying that any non-projection view, having more than two views
19:44:46 ... doesn't make sense
19:44:54 ... Mostly going to be used for static 3d images or movies
19:44:59 ... Where isn't more than two views available
19:45:09 ...
My assumption on top of that is that if you wanted to do something like a portal
19:45:14 ... Might not be best place for layering system
19:45:14 q+
19:45:21 ack ba
19:45:22 ... Probably want to use more traditional means with stencil
19:45:34 ... Worth doing quick feel of the room to see if everyone agrees with those assertions
19:45:49 ... With quad and cylinder, anything more than a strict L+R stereo does not make sense
19:45:50 q?
19:45:54 ... Anyone disagree with that
19:46:14 ack Manisheart
19:46:16 manish: My initial pushback was based on not understanding why non-projection views would want stereo
19:46:23 ... 3d nature of that would be very weird
19:46:32 q+
19:46:33 ... In case of taking existing content formatted like that, maybe useful
19:46:36 ... Agree with PR
19:46:46 alexturn: Generally I think this will cover most use cases
19:46:48 ack alexturn
19:47:05 ... If you make a quad layer, you can say if on left-eye-view or right-eye-view, but don't control view visibility
19:47:11 ... Might be okay
19:47:28 ... May use a quad to do an observer view. Maybe observer view renders differently
19:47:36 q+
19:47:40 ... Do we need that flexibility for things like quad layers, or say that it appears in all views?
19:47:55 ... Maybe complexity can be added later if we need it. Calling out that this limitation exists with this approach
19:47:58 ack cab
19:48:00 Rik: I don't think this is a limitation
19:48:06 ... Quad layer will still be composited
19:48:10 ... If in world space of camera
19:48:14 ... Will be in correct spots
19:48:26 ... The only thing that would.. if stereo layer, which would you pick
19:48:47 rik: do we have cases where we would want to exclude the quad layers?
19:49:02 ... If just projection layers, then app may exclude arbitrary geometry in scene to tweak for observers
19:49:16 ... Lost flexibility. Once using quad layer, then content must appear in all views
19:49:23 ... Don't have concrete example of what would be blocked
19:49:32 ...
Functionality in OpenXR that is not represented here
19:49:42 ... Not strong objection, but is an artifact to note here
19:49:47 ... Worth filing for async discussion
19:52:09 i/manishearth: Clarifications/scribe: kip/
19:52:12 rrsagent, please publish minutes v2
19:52:12 I have made the request to generate https://www.w3.org/2020/06/16-immersive-web-minutes.html atsushi
20:01:47 i/brandon: You can probably successfully /scribe: kip/
20:01:50 rrsagent, please publish minutes v2
20:01:50 I have made the request to generate https://www.w3.org/2020/06/16-immersive-web-minutes.html atsushi
20:02:21 i/bajones: the math for this/scribe: Manishearth/
20:02:23 rrsagent, please publish minutes v2
20:02:23 I have made the request to generate https://www.w3.org/2020/06/16-immersive-web-minutes.html atsushi
20:12:12 s|agendum 3. "layers#163 Should non-projection layers support more than 2 views?; Should non-projection layers support more than 2 views?" taken up [from cabanier via atsushi]||
20:12:16 rrsagent, please publish minutes v2
20:12:16 I have made the request to generate https://www.w3.org/2020/06/16-immersive-web-minutes.html atsushi
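[Editor's note: agendum 1 converged on doing layer hit testing in a JS library/polyfill rather than in the UA, with bajones observing that the math "is not anything that can't be solved in JS". Below is a minimal sketch of that kind of library math -- an illustrative ray-vs-quad-layer intersection. All names and data shapes (`hitTestQuad`, `quad.position`/`orientation`/`halfWidth`/`halfHeight`, the returned dictionary) are assumptions for this sketch, not the strawman API from layers#154 or the planned polyfill.]

```javascript
// Illustrative library-side hit test for a quad layer: intersect a ray
// with a finite rectangle given by a rigid pose and half-extents.
// Vectors are [x, y, z]; quaternions are [x, y, z, w].
function hitTestQuad(quad, rayOrigin, rayDir) {
  // Bring the ray into the quad's local space (inverse rigid transform),
  // where the quad lies in the z = 0 plane centered at the origin.
  const inv = invertQuat(quad.orientation);
  const o = rotateByQuat(inv, sub(rayOrigin, quad.position));
  const d = rotateByQuat(inv, rayDir);
  if (Math.abs(d[2]) < 1e-8) return null;       // ray parallel to quad plane
  const t = -o[2] / d[2];                       // hit parameter on z = 0 plane
  if (t < 0) return null;                       // quad is behind the ray
  const x = o[0] + t * d[0];
  const y = o[1] + t * d[1];
  if (Math.abs(x) > quad.halfWidth || Math.abs(y) > quad.halfHeight) return null;
  return { distance: t, layerX: x, layerY: y }; // hit point in layer space
}

// Minimal quaternion/vector helpers.
function invertQuat([x, y, z, w]) { return [-x, -y, -z, w]; }
function rotateByQuat(q, v) {
  const [qx, qy, qz, qw] = q;
  // v' = v + 2 * (q.xyz x (q.xyz x v + w * v))
  const cx = qy * v[2] - qz * v[1] + qw * v[0];
  const cy = qz * v[0] - qx * v[2] + qw * v[1];
  const cz = qx * v[1] - qy * v[0] + qw * v[2];
  return [
    v[0] + 2 * (qy * cz - qz * cy),
    v[1] + 2 * (qz * cx - qx * cz),
    v[2] + 2 * (qx * cy - qy * cx),
  ];
}
function sub(a, b) { return [a[0] - b[0], a[1] - b[1], a[2] - b[2]]; }
```

[A real library would loop this over each hittable layer (quads and cylinders only, per Rik's note that equirect layers shouldn't hit) and return the nearest intersection.]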