IRC log of immersive-web on 2023-09-15

Timestamps are in UTC.

09:44:18 [RRSAgent]
RRSAgent has joined #immersive-web
09:44:22 [RRSAgent]
logging to https://www.w3.org/2023/09/15-immersive-web-irc
09:44:22 [tidoust]
tidoust has joined #immersive-web
09:46:37 [yonet]
topic: https://github.com/immersive-web/spatial-favicons/issues/7
09:46:37 [ada]
present+
09:46:37 [ada]
q+
09:46:37 [Zakim]
Zakim has joined #immersive-web
09:47:08 [tidoust]
present+
09:47:09 [yonet]
present+
09:47:09 [atsushi]
rrsagent, make log public
09:47:10 [ada]
q+
09:47:10 [ada]
q-
09:47:17 [ada]
present+
09:47:17 [atsushi]
meeting: Immersive Web WG/CG TPAC 2023 Day3
09:47:17 [vicki]
present+
09:47:17 [Brandel]
present+
09:47:17 [ada]
scribenick: vicki
09:47:22 [gombos]
gombos has joined #immersive-web
09:47:30 [gombos]
Present+ Laszlo_Gombos
09:47:41 [vicki]
ada: this might be a long one. we have a repo in the CG for spatial favicons, not a lot has happened.
09:47:44 [atsushi]
agenda: https://github.com/immersive-web/administrivia/tree/main/TPAC-2023#agenda
09:47:48 [atsushi]
rrsagent, publish minutes
09:47:49 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html atsushi
09:47:56 [shiling]
shiling has joined #immersive-web
09:48:04 [vicki]
... maybe the favicon for the web app manifest, maybe for the page
09:48:06 [atsushi]
TPAC 2023 Day2 minutes -> https://www.w3.org/2023/09/12-immersive-web-minutes.html
09:48:40 [atsushi]
topic: Agenda bashing
09:48:40 [vicki]
... might not totally be within our charter but let's talk about it
09:48:57 [cabanier]
q+
09:49:00 [vicki]
ada: seems like a very cool thing. could user agents punch out to an SVG to a blobby thing to make a 3d favicon look good?
09:49:05 [atsushi]
rrsagent, publish minutes
09:49:06 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html atsushi
09:49:12 [yonet]
ack cabanier
09:49:15 [ada]
ack cabanier
09:49:21 [vicki]
... any thoughts? or should we archive this?
09:49:51 [atsushi]
s/topic: Agenda bashing//
09:50:02 [vicki]
cabanier: magic leap implemented this a few years ago, and made the original proposal
09:50:12 [atsushi]
s|TPAC 2023 Day2 minutes -> https://www.w3.org/2023/09/12-immersive-web-minutes.html||
09:50:14 [vicki]
... at the time there were a lot of questions
09:50:22 [vicki]
... icons are underspecified
09:50:30 [vicki]
... they are not fetched by the page but by the browser
09:50:41 [vicki]
... I'm not sure that matters
09:50:49 [atsushi]
i|topic: https://github.com/immersive-web/spatial-favicons/issues/7|TPAC 2023 Day2 minutes -> https://www.w3.org/2023/09/12-immersive-web-minutes.html
09:50:50 [atsushi]
rrsagent, publish minutes
09:50:52 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html atsushi
09:50:59 [vicki]
... as long as we don't say, let's support more mime types
09:51:11 [ada]
q?
09:51:12 [vicki]
... I think there's a fallback mechanism here that would be nice to see
09:51:50 [vicki]
ada: I think it makes sense. I want to just drop in my Sistine Chapel.gltf
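The fallback mechanism cabanier alludes to above, where browsers that don't understand a 3D icon type simply ignore it, could be sketched roughly like this. This is purely illustrative: the `model/gltf-binary` icon type and the `pickIcon` helper are assumptions, not anything specified anywhere.

```javascript
// Hypothetical sketch: a user agent picking the richest declared
// favicon it understands. A UA that does not support a 3D "spatial"
// icon type skips it and falls back to a 2D format.
const SUPPORTED_TYPES = ["image/png", "image/svg+xml"]; // a non-3D-capable UA

function pickIcon(declaredIcons, supported) {
  // Later declarations win when supported, mirroring how browsers tend
  // to treat lists of <link rel="icon"> entries.
  for (let i = declaredIcons.length - 1; i >= 0; i--) {
    if (supported.includes(declaredIcons[i].type)) return declaredIcons[i].href;
  }
  return null; // nothing usable; the UA falls back to /favicon.ico
}

const icons = [
  { href: "/icon.png", type: "image/png" },
  { href: "/icon.svg", type: "image/svg+xml" },
  { href: "/icon.glb", type: "model/gltf-binary" }, // hypothetical spatial favicon
];

// A 2D-only UA skips the model and takes the SVG; a 3D-capable UA that
// lists "model/gltf-binary" as supported would take "/icon.glb" instead.
pickIcon(icons, SUPPORTED_TYPES);
```

The point of the sketch is that a spatial favicon can be additive: pages declare it alongside existing icons and nothing breaks for browsers that ignore it.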
09:52:06 [vicki]
cabanier: it would be great if there were an organization or association to define that
09:52:44 [vicki]
ada: maybe the Alliance for Open USD
09:52:44 [Brandel]
q+
09:52:44 [vicki]
... something with a very limited feature set, very small
09:52:47 [vicki]
... I'd love to see what people do with a restricted medium like this
09:52:47 [ada]
ack Brandel
09:53:14 [vicki]
Brandel: AOUSD initial agreement is to go through current material and turn it into a standard
09:53:28 [vicki]
... and next, to find out where other relevant standards are necessary
09:53:30 [vicki]
... web is next
09:55:02 [vicki]
... one benefit of NURBS is that they can be used to represent low-res things
09:55:08 [ada]
q?
09:55:41 [vicki]
ada: it would be fun to see the standards evolve for small model formats
09:56:08 [vicki]
cabanier: would it make sense to reach out to the working group to see
09:56:18 [vicki]
ada: that would be a good idea, this kind of thing takes a long time
09:56:43 [vicki]
... work with the WHATWG for page favicons, or the Web Apps WG for the web app manifest
09:57:16 [vicki]
... wasn't there something like, "it's not up to us to decide what formats to do"? wasn't there a list somewhere?
09:57:26 [vicki]
... this was pre-model
09:58:07 [vicki]
cabanier: if image supports this, maybe support for model too
09:58:21 [vicki]
ada: browsers that don't support it could just ignore it
09:58:35 [vicki]
... if someone were to implement it...
09:58:44 [vicki]
cabanier: ideally there would be a document somewhere to specify
09:58:57 [vicki]
ada: esp now that SVG favicons are supported
09:59:22 [vicki]
ada: do people still feel positive? we'd probably be the lynchpin
10:00:07 [vicki]
... should we do a CG report to establish what it means to do a small one?
10:00:07 [vicki]
...we can do it in the spatial favicons repo
10:00:13 [vicki]
... establish some base guidelines for what a small model should be
10:00:54 [vicki]
... guidelines for both gltf and usd
10:01:09 [vicki]
... this is approx what we think it should do. mention NURBS, mention low-poly, etc
10:01:20 [cabanier]
proposal for size constraints from magic leap: https://ml1-developer.magicleap.com/en-us/learn/guides/portal-icon-guidelines
10:01:40 [vicki]
... I don't think the spatial favicons effort will ever leave the CG, more like "this is a thing we think" vs a standard
10:01:51 [vicki]
... +1 if you think this is a good idea
10:01:53 [yonet]
0
10:01:57 [etienne]
+1 for "small models"
10:01:58 [vicki]
+1
10:01:59 [bialpio]
+0.25
10:02:01 [gombos]
+1
10:02:05 [cabanier]
+1
10:02:15 [vicki]
... 0 for neutral, -1 if against
10:02:55 [vicki]
cabanier: I'm a little worried bc I don't know who is making the right decision, there could be arguing
10:03:10 [vicki]
...maybe we'll end up just doing our own thing
10:03:38 [vicki]
ada: the history of favicon is pretty loose, I think someone from Netscape was just like "here's a thing I added" one evening
10:03:51 [vicki]
...this might be all of the discussion we need around this
10:04:18 [vicki]
...let's pick up another 15 min issue
10:05:08 [ada]
https://github.com/immersive-web/administrivia/issues/201
10:05:08 [yonet]
topic: https://github.com/immersive-web/administrivia/issues/201
10:05:41 [vicki]
yonet: is there anything else we should add/ask from our community? there's a Help Wanted tag on this
10:05:41 [vicki]
...one is types
10:05:41 [vicki]
cabanier: I think editors have not really gone through this, some issues are very old
10:05:45 [vicki]
...maybe we should start by going through this
10:06:36 [yonet]
https://github.com/immersive-web/webxr/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22
10:07:00 [vicki]
ada: it would be nice if /help added the Help Wanted label
10:07:50 [vicki]
...should I remove the HW thing from email? do people actually read it?
10:08:19 [vicki]
cabanier: maybe assign it to someone and they get an email every day
10:08:46 [vicki]
yonet: if we have anything to ask we could promote it a little
10:08:56 [vicki]
ada: I guess now we are getting closer to Rec
10:09:24 [vicki]
...maybe it could be useful for like, "Hey, we want feedback for this" if someone wants to leave comments, it would be welcome on these issues
10:09:52 [vicki]
...action: add a new label for Needs Feedback, put that in the email instead, and add a new command for people to request feedback
10:10:14 [vicki]
...I don't think anyone reads the mailing list. or the emails I send out! :p
10:10:24 [vicki]
...I guess people do turn up for the F2F
10:10:38 [vicki]
...I think this will be a good action for us to do (yonet agrees)
10:11:32 [vicki]
... BTW if anyone in this room wants navigation to work, feel free to drop in and work on it!
10:14:39 [vicki]
break time! \o/
10:17:02 [CharlesL]
CharlesL has joined #immersive-web
10:17:12 [CharlesL]
present+
10:29:10 [tidoust]
tidoust has joined #immersive-web
10:30:46 [etienne]
etienne has joined #immersive-web
10:31:01 [cabanier]
scribenick: cabanier
10:31:01 [bkardell_]
present+
10:31:08 [cabanier]
ada: thank you for joining us
10:31:21 [cabanier]
... we'd like to expose more a11y through webxr
10:31:22 [jamesn]
jamesn has joined #immersive-web
10:31:30 [jamesn]
present+
10:31:40 [ada]
present+
10:31:45 [etienne]
present+
10:31:49 [yonet]
present+
10:32:26 [cabanier]
... for those new to webxr, it's an API that sits on top of xr
10:32:46 [cabanier]
... some devices are attached to a computer but most frequently, the display is the computer
10:32:53 [cabanier]
... such as the vision pro and quest
10:32:56 [vicki]
present+
10:33:03 [cabanier]
... or augmented reality on smart phone
10:33:29 [cabanier]
... the issue we have with adding a11y to WebXR is that we are on top of WebGL which has no a11y features
10:33:41 [cabanier]
... we have some api's that can increase semantics
10:33:51 [cabanier]
... the pixels on the screen are rendered by the user
10:33:58 [cabanier]
... we need help with some of these issues
10:34:21 [cabanier]
... the other item we are working on, unrelated to WebXR, is a new API called the <model> element
10:34:31 [cabanier]
... it's a new html element that can display a 3D model
10:34:34 [Peter]
Peter has joined #immersive-web
10:34:41 [cabanier]
... they are not immersive but can be integrated in the page
10:34:57 [cabanier]
jamesn: is this what we discussed at the last TPAC?
10:35:31 [cabanier]
ada: yes
10:35:31 [yonet]
Accessibility considerations: https://github.com/immersive-web/webxr/blob/main/accessibility-considerations-explainer.md
10:36:11 [cabanier]
ada: it would be a good idea for the developer to expose what is on the screen at the time
10:36:25 [cabanier]
... so we can report back to the user what we are looking at
10:36:38 [jocelyntran]
jocelyntran has joined #immersive-web
10:36:52 [cabanier]
... the naive thing would be to label everything on the screen but it doesn't give you the context
10:37:09 [cabanier]
... the developer can convey that information on what they're rendering on the screen
10:37:24 [cabanier]
... if there was a way to properly get that information out
10:37:48 [vicki]
q+
10:37:59 [yonet]
ack vicki
10:38:05 [cabanier]
... this api should use the proper tools for the job
10:38:31 [jamesn]
q+
10:38:50 [yonet]
ack jamesn
10:38:57 [cabanier]
vicki: are there other elements where you're drawing with webgl? what do the other a11y tools look like?
10:39:16 [cabanier]
jamesn: canvas had an experiment with html overlays
10:39:36 [cabanier]
... so rather than having every line be available, you could potentially do something
10:39:44 [bkardell_]
q+
10:39:51 [cabanier]
... so there's no single description for an entire screen
10:40:09 [bkardell_]
https://www.bbc.co.uk/accessibility/forproducts/xr/presentations/
10:40:18 [cabanier]
bkardell_: could someone break down if there is research or experiments in this area?
10:40:35 [cabanier]
... there are numerous other things that have been explored
10:40:44 [yonet]
q+
10:40:51 [cabanier]
ada: I wish that the people from xr access were here
10:40:59 [CharlesL]
q+
10:41:12 [cabanier]
yonet: one thing that they did, it was to say what the objects are around you
10:41:14 [cabanier]
q+
10:41:25 [tidoust]
q+
10:41:30 [ada]
ack bkardell_
10:41:30 [cabanier]
... that is one of the challenges but they didn't have solutions for that
10:41:33 [ada]
ack yonet
10:41:36 [ada]
ack CharlesL
10:41:49 [cabanier]
CharlesL: xr access has been doing a lot of research at cornell
10:41:54 [CharlesL]
https://xraccess.org/workstreams/prototype-for-the-people/
10:42:03 [cabanier]
... they have prototype
10:42:15 [cabanier]
... they want people to experiment
10:42:32 [cabanier]
... and need help. They broke it down for the different types of disability
10:42:50 [cabanier]
... the developer knows the scene such as a meeting room
10:43:17 [cabanier]
... the user could have preferences on how the room is laid out
10:43:23 [ada]
q+
10:43:33 [cabanier]
... depending on the level they are querying, the metadata can be there
10:43:33 [ada]
ack cabanier
10:43:37 [ada]
scribenick: ada
10:43:45 [yonet]
ack cabanier
10:44:47 [jamesn]
q+
10:44:49 [ada]
cabanier: I know that at TPAC Japan there was an example from 2D canvas, only implemented in Firefox, of an a11y tree: when you draw pixels you also render a semantic tree.
10:45:40 [ada]
cabanier: screen readers would need to be extended to work with it, and it shouldn't be too bad for developers. It wouldn't be hard for developers to render it. But it would fall to the developers.
10:45:49 [cabanier]
scribenick: cabanier
10:45:53 [yonet]
ack tidoust
10:46:21 [cabanier]
tidoust: I want to find solutions to expose the a11y information
10:46:22 [bkardell_]
q+
10:46:36 [cabanier]
... so we need to teach developers on how to expose this
10:46:43 [cabanier]
... if there's information in USD
10:47:12 [cabanier]
... how can this format be extended to give accessible information
10:47:23 [cabanier]
.. that can easily be targeted at developers
10:47:41 [yonet]
ack ada
10:47:43 [cabanier]
yonet: I'm concerned that if you have a model, there are still a lot of other things around you
10:47:52 [cabanier]
ada: we can do both
10:48:07 [cabanier]
... webxr has a long time problem that needs to be solved
10:48:32 [cabanier]
... for both webxr and model
10:48:43 [cabanier]
... adding a11y information to the models is important
10:49:13 [cabanier]
ada: as cabanier said, having a render buffer where you render colors referencing a tree
10:49:32 [CharlesL]
q+
10:49:41 [cabanier]
... this could work really well. You can probably do it within a single render
10:49:45 [cabanier]
q+
10:49:58 [yonet]
ack jamesn
10:50:01 [cabanier]
... it could be a good approach to see if it makes sense
10:50:11 [cabanier]
jamesn: I want to jump back to canvas
10:50:35 [cabanier]
... if we do something with a dom renderer, having something that is overlaid
10:50:45 [cabanier]
... rather than something separately
10:51:06 [cabanier]
ada: unfortunately there are issues here because you can't show html in webxr
10:51:22 [cabanier]
... we've been trying to fix this over the years but it's extremely hard
10:51:42 [cabanier]
... dom content is typically rectangular which doesn't match webgl
10:52:07 [cabanier]
... you can have more advantage if you have a pixel map
10:52:15 [yonet]
ack bkardell_
10:52:29 [cabanier]
bkardell_: I mentioned the research
10:52:37 [bkardell_]
https://abilitynet.org.uk/news-blogs/apple-vision-pro-has-vision-disabled-inclusion
10:52:47 [cabanier]
... apple launched the vision pro with a lot of a11y
10:53:19 [cabanier]
... it would be great if people could share your a11y research
10:53:34 [cabanier]
... we need to gather more information and do studies
10:53:47 [cabanier]
... hard to come up with answers in this room
10:54:12 [yonet]
ack CharlesL
10:54:17 [cabanier]
yonet: the meta research team wanted to present but didn't get permission on time
10:54:41 [cabanier]
CharlesL: having different rendering on top would be nice for color blind people
10:55:09 [cabanier]
... the mention of eye tracking, my eyes shake so I'm worried about this technology
10:55:59 [cabanier]
yonet: for hololens we weren't reactive to the eye movements. The system itself would also train on you
10:56:02 [yonet]
ack cabanier
10:56:11 [cabanier]
CharlesL: I'm using different headsets to try this out
10:56:11 [ada]
scribenick:ada
10:57:59 [bkardell_]
q+
10:58:22 [yonet]
ack bkardell
10:58:25 [ada]
cabanier: The tree itself would be written out into the DOM, but the colours would map to the elements in this tree.
10:58:31 [cabanier]
bkardell_: I assume that this works on simple canvases
10:58:47 [ada]
yonet: it would be good if the same view was used for everyone
10:58:48 [cabanier]
... does this work on more complex content
10:59:23 [ada]
cabanier: the colour index buffer is just used for mapping pixels to elements rather than being visible
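The colour-index idea cabanier describes can be made concrete with a small sketch: pack each semantic node's id into an RGB triple, render the scene into a hidden index buffer using those flat colours, then map a picked pixel back to the node. All names here (`idToColor`, `nodeAtPixel`, the plain-array buffer) are illustrative assumptions, not any real API.

```javascript
// Sketch of the colour-index technique: each node of a semantic tree
// gets a unique 24-bit id, rendered as a flat colour into a buffer
// that is never shown to the user.
function idToColor(id) {
  // one byte per channel
  return [(id >> 16) & 0xff, (id >> 8) & 0xff, id & 0xff];
}

function colorToId([r, g, b]) {
  return (r << 16) | (g << 8) | b;
}

// A screen-reader-style query: which semantic node is under pixel (x, y)?
// indexBuffer is a flat RGB array of width * height * 3 values.
function nodeAtPixel(indexBuffer, width, x, y, tree) {
  const i = (y * width + x) * 3;
  const id = colorToId([indexBuffer[i], indexBuffer[i + 1], indexBuffer[i + 2]]);
  return tree.get(id) ?? null;
}
```

The buffer costs one extra render target, but the lookup itself is constant time, which is what makes it attractive for arbitrarily complex WebGL scenes.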
10:59:28 [ada]
q+
11:00:25 [ada]
the canvas dom overlay technique works for a certain size, but does it actually work for things that are infinitely big, e.g. google maps? Or an open world game with a massive amount of data.
11:00:28 [yonet]
ack ada
11:01:07 [ada]
cabanier: as you walk through the universe you would update the DOM
11:01:17 [ada]
bkardell_: does this work today
11:01:27 [ada]
cabanier: it was liked, but it was removed
11:01:36 [ada]
q-
11:01:56 [cabanier]
ada: you would only apply colors to the leaf nodes?
11:02:42 [yonet]
q+
11:03:09 [CharlesL]
q+
11:04:28 [yonet]
ack yonet
11:04:35 [yonet]
ack CharlesL
11:04:53 [cabanier]
CharlesL: I was trying to figure out, is there an aria type solution?
11:05:40 [cabanier]
ada: is there a way to output arbitrary information for a screen reader
11:05:40 [yonet]
q+
11:05:44 [cabanier]
... today, it has to be a dom tree and the user has to traverse it
11:05:45 [jamesn]
q+
11:05:58 [cabanier]
CharlesL: my thinking we only have screen readers for headsets
11:06:12 [cabanier]
... do we need screen readers for headset?
11:06:29 [cabanier]
ada: the headset can be a dumb screen that is attached to a computer
11:06:44 [cabanier]
... it's up to the browser itself
11:07:16 [cabanier]
bkardell_: are you sure? This is needed for all the apps
11:07:16 [cabanier]
... for browsers, you have an OS level a11y tree
11:07:52 [cabanier]
... with the exception of vision pro, they're all android based so could use virtual touch
11:08:06 [cabanier]
yonet: it would be great to know what was done for canvas
11:08:20 [cabanier]
... and would like to know the challenges in the XR space
11:08:30 [yonet]
ack yonet
11:08:32 [cabanier]
... how can we work with the canvas group
11:08:50 [cabanier]
jamesn: the APA group is not necessarily the right group
11:08:58 [cabanier]
... we know a11y api very well
11:09:30 [jcraig]
jcraig has joined #immersive-web
11:09:50 [cabanier]
... it's unclear if the people in the APA group have experience with accessibility APIs, those with the relevant API experience are more likely to be in ARIA
11:09:57 [cabanier]
... apart from vision pro
11:10:06 [yonet]
q?
11:10:09 [cabanier]
... I'd love to be involved in this conversation
11:10:14 [yonet]
ack jamesn
11:10:27 [jcraig]
rrsagent, make minutes
11:10:28 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html jcraig
11:11:44 [CharlesL]
https://www.w3.org/TR/xaur/
11:11:54 [cabanier]
topic: semantic labels
11:11:58 [ada]
scribenick: ada
11:12:56 [jcraig]
s/ the aria group is not necessarily the right group/ the APA group is not necessarily the right group/
11:14:07 [ada]
cabanier: we added support for semantic labelling in webxr, which gives you information about the room you are in. when the user sets up their device they can annotate their environment: floors, tables, walls, paintings. this information is given to WebXR and used to prevent users from running into objects. It seems useful even at a system level. This is useful for everybody. There are
11:14:08 [ada]
only a limited set of labels right now. E.g. No ottoman label.
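The semantic labels cabanier describes lend themselves to exactly the kind of room summary a screen reader could speak. A minimal sketch, using mock plane objects with a `semanticLabel` field (in real WebXR such labels arrive on detected planes; `describeRoom` and the data shape here are assumptions for illustration):

```javascript
// Sketch: turn labelled room geometry (floor, wall, table, ...) into a
// short spoken summary for a screen reader.
function describeRoom(planes) {
  const counts = {};
  for (const p of planes) counts[p.semanticLabel] = (counts[p.semanticLabel] || 0) + 1;
  return Object.entries(counts)
    .map(([label, n]) => (n === 1 ? `a ${label}` : `${n} ${label}s`))
    .join(", ");
}

// Mock data standing in for planes detected during device setup.
const room = [
  { semanticLabel: "floor" },
  { semanticLabel: "wall" },
  { semanticLabel: "wall" },
  { semanticLabel: "table" },
];

describeRoom(room); // e.g. "a floor, 2 walls, a table"
```

A fixed label vocabulary is what makes this kind of summary translatable and consistent across apps, which is also why the limited label set matters.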
11:14:16 [CharlesL]
q+
11:14:23 [ada]
ack CharlesL
11:14:46 [jcraig]
s/it's unclear if they have accessibility APIs/it's unclear if the people in the APA group have experience with accessibility APIs, those with the relevant API experience are more likely to be in ARIA/
11:14:51 [jcraig]
rrsagent, make minutes
11:14:53 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html jcraig
11:15:30 [jcraig]
q+
11:15:30 [yonet]
Semantic labels: https://github.com/immersive-web/semantic-labels
11:15:30 [ada]
CharlesL: is this done through a W3C registry?
11:15:40 [ada]
cabanier: right now it's in a repo people can contribute to
11:16:14 [CharlesL]
q?
11:16:15 [ada]
CharlesL: internationalization, can the hardware do it?
11:16:26 [yonet]
ack jcraig
11:16:32 [ada]
cabanier: the hardware can do it
11:16:49 [jcraig]
https://www.aswf.io/blog/community-support-needed-for-new-accessibility-initiative/
11:17:02 [myles]
myles has joined #immersive-web
11:17:58 [ada]
james craig: Apple has been working with the Academy Software Foundation (ASWF) to improve a11y. They are going to be working with the USD format to... [connection lost]
11:18:01 [CharlesL]
rrsagent, make minutes
11:18:02 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html CharlesL
11:18:22 [yonet]
https://github.com/immersive-web/webxr-accessibility
11:18:47 [atsushi]
i/ada: you would only apply colors to the leaf/scribe+ cabanier/
11:18:56 [atsushi]
rrsagent, publish minutes
11:18:58 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html atsushi
11:19:11 [atsushi]
scribe- cabanier
11:19:16 [jcraig]
https://wiki.aswf.io/plugins/servlet/mobile?contentId=74940495#content/view/74940495
11:21:57 [jcraig]
Quoting from last link…
11:22:47 [jcraig]
USD Accessibility needs :
11:22:47 [jcraig]
- Method to label an object (container or leaf node)
11:22:48 [jcraig]
- Method to label a time range (possibly use a caption format? like VTT metadata)
11:22:48 [jcraig]
- If this format can include video and/or audio, how is loc/lang handled? We may need something similar same for captions and audio descriptions.
11:22:54 [jcraig]
rrsagent, make minutes
11:22:55 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html jcraig
11:24:11 [jcraig]
s/Apple have been /Apple has been /
11:25:12 [jcraig]
s/??? foundation/Academy Software Foundation (ASWF)/
11:25:34 [jcraig]
rrsagent, make minutes
11:25:36 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html jcraig
11:52:33 [myles]
myles has joined #immersive-web
12:15:26 [myles]
myles has joined #immersive-web
12:23:46 [tidoust]
tidoust has joined #immersive-web
12:25:28 [myles]
myles has joined #immersive-web
12:28:55 [atsushi]
atsushi has joined #immersive-web
12:35:39 [bialpio]
bialpio has joined #immersive-web
12:37:54 [bialpio]
present+
12:37:58 [atsushi]
rrsagent, publish minutes
12:37:59 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html atsushi
12:38:05 [ada]
scribenick: ada
12:38:05 [cabanier]
present+
12:38:05 [alcooper]
present+
12:38:35 [yonet]
yonet has joined #immersive-web
12:38:39 [yonet]
topic: https://github.com/immersive-web/webxr-ar-module/issues/90
12:39:49 [ada]
cabanier: this issue is something we have noticed with experiences that start off in VR and then progress into AR. for a big portion you are doing all the AR things (cameras, depth, reprojection) which is a lot of wasted power.
12:39:51 [ada]
q+
12:39:54 [bialpio]
q+
12:40:17 [ada]
cabanier: I don't think it is something we, the browser, can figure out
12:40:26 [yonet]
scribenick:yonet
12:40:51 [yonet]
ada: It might be more useful to change session type within the session
12:41:50 [yonet]
... if you can ask for the features up front. It might be a request to change the XR session. I think it would be a more useful way, like going from inline to immersive modes
12:42:02 [alcooper]
q+
12:42:09 [ada]
ack ada
12:42:32 [yonet]
rik: you are proposing while in session, a new request for a different session?
12:43:46 [yonet]
ada: or devs can request more than one immersive session and we give the new xr session.
12:43:46 [yonet]
...I don't think it would break any current usage
12:44:09 [yonet]
...it would be something like xrsession.transition or something
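No such API exists; "xrsession.transition" above is only a name ada floats. To make the ergonomics under discussion concrete (asynchronous, can fail, not instantaneous on every platform), here is a mock of that hypothetical shape. Everything in it is an assumption for discussion.

```javascript
// Purely hypothetical: a promise-based, in-session switch between
// immersive-ar and immersive-vr, mocked so the ergonomics can be seen.
class MockXRSession {
  constructor(mode) {
    this.mode = mode; // "immersive-ar" | "immersive-vr"
  }
  transition(targetMode) {
    if (!["immersive-ar", "immersive-vr"].includes(targetMode)) {
      // e.g. a device that cannot satisfy the requested mode rejects
      return Promise.reject(new Error("unsupported mode"));
    }
    // Resolve asynchronously: platforms may need time for camera feed
    // warm-up or compositor changes before the switch completes.
    return new Promise((resolve) =>
      setTimeout(() => {
        this.mode = targetMode;
        resolve(this.mode);
      }, 0)
    );
  }
}
```

Keeping it a promise captures both points raised in the discussion: the switch is not guaranteed to be instant, and a device (e.g. an additive display refusing opaque VR) can fail it cleanly instead of tearing down the session.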
12:44:23 [yonet]
rik: I think it would be awkward
12:44:53 [yonet]
ada: if you put it as an optional feature
12:45:02 [yonet]
rik: you can't make it an optional feature
12:45:57 [yonet]
ada: if we request as a feature, it would be like showing another room when you want to switch the mode.
12:46:11 [yonet]
Rik: I would prefer if it was in the render state.
12:46:42 [yonet]
ada: when you are in immersive-vr you are not going to test for hit test or anything
12:47:19 [yonet]
rik: if you know what the room looks like, you could do hit testing on the room. You want to keep the environment if you want to keep the AR mode
12:47:36 [yonet]
...creating a new session will have discontinuity
12:48:30 [yonet]
ada: I think you are right, especially on handheld; in that situation what you want is to still be in the handheld mode but switch from ar to vr
12:48:48 [yonet]
rik: do you think a flag?
12:49:20 [yonet]
ada: I don't think every platform can do it instantaneously, so it should be a promise
12:50:02 [yonet]
... if a system app starts taking over doing ar stuff, you couldn't get access again for an immersive ar session
12:50:27 [yonet]
rik: yes. I think I can see it both ways.
12:51:01 [yonet]
ada: for some things like change the mode from transparency
12:51:20 [yonet]
rik: Hololens and a device like that would fail the promise.
12:52:03 [yonet]
ada: you might also want to fire it as an optional feature, that way the device can get ready for the session change
12:52:07 [yonet]
q?
12:52:16 [yonet]
ack bialpio
12:53:22 [yonet]
bialpio: vr to ar is tricky. I don't know if we want to allow that. How do you display the permission prompt?
12:53:48 [yonet]
ada: you can still have ar session. You call the session immediately
12:54:19 [yonet]
bialpio: we don't need that feature now, right?
12:55:20 [yonet]
ada: For the escape room case, for the situation where you fail the camera, more work needs to be done. You say this is reserved
12:55:20 [vicki]
vicki has joined #immersive-web
12:55:55 [yonet]
bialpio: for the vr, the device can render over everything. After the ar mode, it will render everything over again. I think this is over-optimization
12:56:08 [yonet]
ada: you are not going to get the camera feed instantly
12:56:18 [yonet]
...we will need a transition state
12:57:25 [yonet]
bialpio: I agree that updateRenderState might not know if the camera is rendered or not; for those cases we might need a promise
12:58:18 [yonet]
...I think we will know if we are on additive devices or not. This case is not enough for the feature as a promise.
12:59:08 [yonet]
... for hit test we don't require an ar session. you could expose hit test. the only thing is that we don't have a way of hit testing on virtual objects in vr
12:59:17 [cabanier]
q+
12:59:18 [yonet]
q?
12:59:25 [yonet]
ack alcooper
12:59:27 [bialpio]
bialpio has joined #immersive-web
12:59:37 [ada]
scribenick: ada
12:59:39 [Leonard]
Leonard has joined #immersive-web
12:59:53 [Leonard]
present+
13:00:12 [bialpio]
q+
13:00:14 [ada]
alcooper: is there anything in OpenXR that has an example for how they have done this, and whether it can actually stop doing work when we ask it to.
13:01:08 [ada]
alcooper: I think as long as the session is going the camera will still be going, but that could just be our implementation.
13:01:31 [ada]
alcooper: I don't see going from immersive-ar to immersive-vr being something feasible since they are essentially two separate runtimes.
13:01:36 [yonet]
ack cabanier
13:02:08 [cabanier]
q+
13:03:19 [ada]
cabanier: on OpenXR there is a system ready call to do passthrough, and a passthrough layer which can be removed from the list of layers.
13:03:53 [ada]
cabanier: to bialpio: since it could be a hint, for additive devices they can just fulfill the promise immediately.
13:04:11 [yonet]
ack bialpio
13:04:31 [alcooper]
q+
13:04:42 [ada]
bialpio: I think this makes sense. I wanted to add some details about ARCore, in which case I think we can immediately switch, at least in one direction or maybe both, for the camera availability.
13:05:20 [ada]
alcooper: Yeah we should be able to do that but we may at least save a little optimisation although it may do it anyway.
13:05:39 [ada]
bialpio: yeah it might be a very small performance improvement for android
13:06:58 [cabanier]
q+
13:07:01 [ada]
bialpio: there might be some features that the site may want access to which we may be able to disable, but I don't think we would be able to disable these on the fly; in ARCore they need to be set on session creation. Some, such as image tracking, add latency to the camera, a couple of ms of delay, which is noticeable. We probably wouldn't be able to turn these off.
13:07:15 [yonet]
ack cabanier
13:07:25 [yonet]
ack alcooper
13:07:37 [ada]
cabanier: I agree that if depth sensing or image tracking are turned on then it won't do anything
13:08:43 [ada]
alcooper: one thought I had regarding passthrough: would it make sense to have passthrough be a request in immersive-vr too, so passthrough is just a feature that can be turned on or off
13:09:19 [yonet]
scribenick: yonet
13:09:39 [bialpio]
q+
13:10:01 [yonet]
cabanier: I don't think the developer thinks of their app as vr.
13:11:15 [cabanier]
q+
13:11:33 [ada]
scribenick: ada
13:11:37 [yonet]
alcooper: to rik's point, maybe it is weird to have these ar vs vr with the pass through
13:11:47 [yonet]
ack bialpio
13:13:18 [ada]
bialpio: I think this is starting to blur the lines between AR and VR. I don't think many of the features associated with AR are blocked in VR sessions, and it's worthwhile considering AR sessions as more sensitive in terms of privacy, so perhaps some features should be restricted to AR only.
13:13:18 [ada]
q?
13:13:18 [alcooper]
q+
13:13:18 [ada]
q+
13:13:19 [yonet]
ack cabanier
13:14:52 [ada]
cabanier: one more reason they should start from AR then disable is preserving the mode type in android, since we don't want people starting in cardboard mode then going AR
13:14:52 [ada]
bialpio: also it would need to solve getting the permission prompt
13:15:25 [yonet]
ack alcooper
13:15:26 [ada]
cabanier: it would also cause the session to grind to a halt for a few seconds
13:15:32 [Leonard]
[AFK, BRB]
13:15:55 [bialpio]
q+
13:16:10 [ada]
alcooper: if it is a hint we need to have some way to show it can fail, e.g. if you are doing raw camera and ask us to disable your camera, that should probably fail
13:16:11 [yonet]
scribenick: yonet
13:16:15 [ada]
q-
13:16:28 [yonet]
ack bialpio
13:17:16 [ada]
bialpio: re raw camera access, we would need to start handing off camera frames while the user cannot see what is visible to the camera.
13:17:19 [ada]
q+
13:17:37 [yonet]
ack ada
13:18:14 [yonet]
ada: if you request raw camera access, what is stopping the dev from rendering a big square to hide the environment
13:18:53 [ada]
alcooper: for what it's worth you could do the same thing with getUserMedia.
13:19:06 [alcooper]
s/something/the same thing
13:19:22 [yonet]
topic: https://github.com/immersive-web/webxr/issues/1345
13:20:07 [yonet]
ada: I've chatted with HDR experts. I've been trying to work out what we need to expose to the user
13:20:25 [eeeps]
eeeps has joined #immersive-web
13:20:41 [yonet]
... there are some xr devices, like vision pro and meta quest pro, that have a display that is hdr
13:20:53 [yonet]
...currently all xr sessions are srgb
13:21:28 [yonet]
...what was it that you explained about quest pro? do you just stretch out the color space?
13:21:30 [Leonard]
q+
13:22:21 [yonet]
rik: most websites do it in srgb but the spec says it is linear
13:23:23 [yonet]
...I think this is how webgl works too.
13:23:42 [yonet]
... we were creating a linear colorspace. in some browser implementations the colors are wrong
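The sRGB vs linear mismatch rik mentions comes down to the standard sRGB transfer functions: lighting math should happen on linear values, which are then encoded back to sRGB for display. These are the well-known IEC 61966-2-1 formulas, shown here only to make the distinction concrete.

```javascript
// Standard sRGB transfer functions (per-channel values in [0, 1]).
// Rendering math belongs in linear space; displays expect sRGB encoding.
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function linearToSrgb(c) {
  return c <= 0.0031308 ? 12.92 * c : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}
```

Mixing the two up (e.g. lighting in already-encoded sRGB values) is exactly the "colors are wrong" class of bug described above.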
13:24:00 [yonet]
ada: what Rik told me is that gl layers can be floats
13:24:14 [yonet]
rik: spec does not allow floats but can be updated
13:24:41 [yonet]
...webgl can render to floating point rgb buffer for hdr, that is totally possible
13:24:56 [yonet]
...if not, we need to request the color spaces
13:25:30 [yonet]
ada: I was thinking of having a preferred color space, and a list of supported ones
13:26:21 [yonet]
...there needs to be a way to say what color space you are using, the color space you are rendering needs to be supported by the frameworks. You need to be able to tell which one you are doing.
13:26:46 [Leonard]
q
13:26:54 [yonet]
... for headsets like vision pro, headroom might change
13:27:05 [cabanier]
q+
13:28:11 [yonet]
... we provide the user a list of color spaces, and the developer can tell us which color space they are rendering. There might be some reason they might use the headroom
13:28:23 [Leonard]
QUEUE?
13:28:26 [yonet]
...I think we will have to draft something
13:28:49 [yonet]
ack Leonard
13:30:25 [ada]
Leonard: I think this may be resolved, but is this about colourspaces or HDR?
13:30:25 [ada]
Brandel: This is more sRGB vs P3, which is also HDR
13:31:32 [ada]
Leonard: These calculations are done in linear colourspace and then would need to go through tonemapping to go to the display.
13:31:33 [gombos]
gombos has joined #immersive-web
13:32:16 [yonet]
ack cabanier
13:32:17 [ada]
Leonard: Emmet Liash from Google is an expert in this field.
13:34:20 [Leonard]
References: P3 - https://en.wikipedia.org/wiki/DCI-P3; sRGB - https://en.wikipedia.org/wiki/SRGB
13:34:20 [eeeps]
The HDR on the Web breakout session (https://www.w3.org/events/meetings/009a5b81-0459-4ae4-9b33-f88dd9a9d89f/) was good and a few of the issues discussed there seem relevant
13:35:38 [Leonard]
Different color spaces have a major impact on ensuring consistency of display results between different devices
13:36:34 [ada]
cabanier: @ada re the headroom we need to expose, WebXR is based on a canvas the developer instantiates before the session starts
13:37:11 [ada]
ada: the canvas context needs to know the details of the device rather than the computer, so the information needs to be present on the XR session
13:38:30 [ada]
cabanier: the canvas already needs a way to retarget itself to whatever the display is; whatever they do with that, we can treat the canvas context the same on the headset.
13:38:53 [yonet]
q?
13:39:18 [bialpio]
Related, 2d canvas has a way to specify color space: https://chromestatus.com/feature/5807007661555712
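The 2D-canvas option bialpio links to is a context creation setting; as a small illustration (the `querySelector` call and logging are hypothetical, `colorSpace: 'display-p3'` is the shipped wide-gamut value alongside the default `'srgb'`):

```javascript
// Request a Display P3 backing store for a 2D canvas. Browsers that
// don't support the option fall back to sRGB.
const canvas = document.querySelector('canvas'); // hypothetical element
const ctx = canvas.getContext('2d', { colorSpace: 'display-p3' });
// getContextAttributes() reports the color space actually granted.
console.log(ctx.getContextAttributes().colorSpace);
```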
13:40:18 [ada]
need to get the list of supported depth formats for the system compositor so that the developer can choose the correct one.
13:40:46 [yonet]
rik: I am not sure if we support floating points
13:42:19 [yonet]
ada: it seems like we are going to be blocked. At the HDR meeting they showed us all that is required. Three things were blockers
13:42:45 [Leonard]
q+
13:42:51 [yonet]
ack Leonard
13:43:28 [yonet]
Leonard: are we going to discuss this in our regular meetings? I want to make sure the experts will join
13:43:50 [yonet]
ada: I will bring the topic back, otherwise the issue is going to get worse with more devices
13:44:20 [yonet]
Leonard: the color space is the issue, different than hdr
13:44:45 [yonet]
ada: I guess only hdr is dependent on buffers
13:45:02 [yonet]
rik: I don't think we have a problem currently about the color space
13:45:14 [yonet]
ada: we should at least expose preferred color spaces
13:45:27 [yonet]
rik: you mean, preferred texture formats
13:45:30 [bialpio]
q+
13:45:54 [yonet]
ada: color space
13:46:24 [yonet]
ack bialpio
13:46:28 [ada]
Leonard: is this about the final output, or the colour space physically based rendering is done in?
13:46:35 [ada]
Brandel: it's the tonemapping, the final output
13:47:07 [yonet]
ada: I was pretty sure three.js uses linear these days and then converts to sRGB at the end
13:48:35 [yonet]
bialpio: OpenGL assumes sRGB. You do all the math in linear space, and then turn it into sRGB. We might be confused about some of this.
13:49:03 [yonet]
...webgl and opengl will always assume linear
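The linear-vs-sRGB distinction being debated comes down to the sRGB transfer function; as a sketch (function names are illustrative, the constants are from the standard sRGB definition in IEC 61966-2-1):

```javascript
// sRGB decode (electro-optical) and encode (inverse) transfer functions,
// per IEC 61966-2-1. Lighting math should operate on linear values;
// encoding to sRGB happens only at the very end of the pipeline.
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function linearToSrgb(c) {
  return c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}
```

If a browser skips this conversion somewhere (as discussed above), mid-tones come out visibly wrong, since sRGB 0.5 corresponds to roughly 0.21 in linear light.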
13:50:18 [yonet]
ada: so correct colors won't work in GL?
13:50:18 [yonet]
rik: yes
13:50:52 [yonet]
q?
13:50:52 [yonet]
rik: we have to do texture formats
13:51:24 [yonet]
ada: down the road, we will have to display everything to the user.
13:51:57 [Leonard]
Do you really want to commit to WebGL as it is [slowly] dropping out in favour of WebGPU?
13:51:57 [yonet]
...if you have 3 headsets connected to a computer they each will have a different profile
13:53:30 [cabanier]
scribenick: cabanier
13:53:30 [Leonard]
TY all - great meeting. I need to leave now.
13:54:05 [yonet]
Thanks Leonard
13:54:05 [cabanier]
topic: Automated testing of WebXR sessions with WebDriver
13:54:10 [cabanier]
ada: there's a W3C group for WebDriver
13:54:25 [cabanier]
... it would be great if webxr could use webdriver
13:54:25 [alcooper]
q+
13:54:42 [cabanier]
... I think it would be useful for developers building complex websites
13:54:50 [cabanier]
... but also in depth testing for webxr
13:55:02 [cabanier]
... I was wondering if people think our group should do this
13:55:15 [cabanier]
... and then submit it to the group that manages WebDriver
13:55:20 [yonet]
ack alcooper
13:55:30 [vicki]
vicki has joined #immersive-web
13:55:36 [cabanier]
alcooper: we have the webxr test api
13:55:44 [cabanier]
... which chrome uses to drive tests
13:55:54 [cabanier]
... I believe Apple contributed to that as well
13:55:56 [bialpio]
WebXR Test API: https://immersive-web.github.io/webxr-test-api/
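From the linked WebXR Test API spec, a fake device is connected through `navigator.xr.test`, which (as alcooper notes later) only exists in test builds or under the web-platform-tests harness. A hedged sketch, with the init values as illustrative assumptions rather than a complete valid configuration:

```javascript
// Sketch: drive a fake XR device via the WebXR Test API.
// navigator.xr.test is only present in test builds / the WPT harness.
const device = await navigator.xr.test.simulateDeviceConnection({
  supportedModes: ['immersive-vr'],
  views: [],  // a real test supplies valid FakeXRViewInit entries here
});

// Script the viewer around instead of a person wearing the headset.
device.setViewerOrigin({
  position: [0, 1.6, 0],
  orientation: [0, 0, 0, 1],
});

const session = await navigator.xr.requestSession('immersive-vr');
// ...run assertions against poses and frames, then tear down:
device.disconnect();
```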
13:56:06 [cabanier]
... maybe more could be done there
13:56:11 [cabanier]
ada: that's great.
13:56:20 [cabanier]
... maybe that could be the basis for this type of work
13:56:32 [cabanier]
... that sounds great
13:56:39 [bialpio]
q+
13:56:46 [cabanier]
... should the group be involved in this
13:56:51 [yonet]
ack bialpio
13:56:56 [cabanier]
bialpio: do we know what is expected here?
13:57:06 [cabanier]
... we already have the web platform tests
13:57:12 [cabanier]
... it mostly mocks some data
13:57:22 [cabanier]
... depending on what is needed, it's sufficient
13:57:39 [cabanier]
... in chromium, you can't take advantage of this outside of web platform tests
13:57:54 [cabanier]
... we don't initialize the components all the time. It only exists in our tests
13:57:59 [alcooper]
q+
13:58:24 [cabanier]
ada: what I'd like to see happen, is that I can plug my HMD into the computer and then run my tests
13:58:42 [cabanier]
... and then dump it to a file, generate screenshots, diff them, etc.
13:59:01 [cabanier]
... so if I make a change, everything still works as expected
13:59:16 [cabanier]
... it's not just mocking, I want to actually drive the hardware
13:59:28 [cabanier]
bialpio: you can send triangles through the mock api
13:59:42 [cabanier]
... we have an implementation that drives this
13:59:56 [cabanier]
... you can't control ref space, input sources
14:00:10 [cabanier]
... so you could write a script that moves you around
14:00:32 [cabanier]
... related to plugging in the headset, ARCore allows us to record an AR session
14:00:43 [cabanier]
... this is how we do testing internally
14:00:51 [cabanier]
... check how the api is working
14:01:13 [cabanier]
... If you go to session recording, it's very backend specific
14:01:37 [cabanier]
... we don't do anything special, just feed the prerecorded session to the application
14:01:45 [cabanier]
... the test API might be sufficient
14:01:54 [cabanier]
ada: is this API defined?
14:02:05 [cabanier]
bialpio: yes, Alex and Manish are the editors
14:02:24 [cabanier]
... for all the APIs we ship, we've been updating the data
14:02:42 [yonet]
ack alcooper
14:02:42 [cabanier]
... we have a javascript implementation that implements the runtime
14:02:55 [cabanier]
alcooper: the goal of the test API is to serve as a fake device
14:03:09 [cabanier]
... if you want to inspect bits of state
14:03:18 [cabanier]
... it drives and controls the state
14:03:39 [cabanier]
... you need a special build of chrome. We only do it as a polyfill
14:04:33 [bialpio]
q+
14:04:33 [cabanier]
... some of our tests do rely on generic webdriver functionality
14:04:42 [cabanier]
ada: it sounds that we want to build on top of this
14:05:07 [cabanier]
... we don't want to have users wear the headset while the tests run
14:05:13 [yonet]
ack bialpio
14:05:15 [cabanier]
alcooper: we don't ship any of that
14:05:28 [cabanier]
bialpio: just to clarify, it's not really a polyfill
14:05:41 [cabanier]
... because the native code still runs
14:06:00 [cabanier]
... we talk to a javascript backend. Just not arcore
14:06:30 [cabanier]
... I'm looking at the spec and it looks like we need to relax it if we want it to run on shipped builds
14:07:03 [cabanier]
ada: if we go to the Selenium group, we want to say that it references this API
14:07:08 [cabanier]
... but that it can ship in browser
14:07:24 [cabanier]
alcooper: I believe Manish implemented this in servo
14:07:43 [cabanier]
... I think it's fair to discuss a separate interface
14:07:56 [cabanier]
... that can control a generic device
14:08:41 [cabanier]
ada: a developer could use the test API without Selenium, without us needing to make changes there
14:09:12 [cabanier]
... I'm glad we had this conversation
14:12:46 [myles]
myles has joined #immersive-web
14:12:59 [atsushi]
rrsagent, publish minutes
14:13:01 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html atsushi
14:14:11 [atsushi]
i/Leonard: I think this may be resolved, but is/scribe+ ada/
14:14:19 [atsushi]
rrsagent, publish minutes
14:14:20 [RRSAgent]
I have made the request to generate https://www.w3.org/2023/09/15-immersive-web-minutes.html atsushi
15:05:07 [eeeps]
eeeps has joined #immersive-web
15:34:09 [dino]
dino has joined #immersive-web
16:33:06 [Zakim]
Zakim has left #immersive-web
17:34:06 [dino]
dino has joined #immersive-web