IRC log of immersive-web on 2020-11-10

Timestamps are in UTC.

20:02:08 [RRSAgent]
RRSAgent has joined #immersive-web
20:02:08 [RRSAgent]
logging to https://www.w3.org/2020/11/10-immersive-web-irc
20:02:12 [cwilso]
present+
20:02:16 [cwilso]
present+ Leonard
20:02:27 [cwilso]
present+ Brandon
20:02:32 [cwilso]
present+ Manish
20:02:43 [klausw]
present+
20:02:47 [dino]
present+
20:05:53 [atsushi]
meeting: 2020-11-10 Immersive Web Community Group Teleconference
20:06:01 [atsushi]
date: 10 Nov 2020
20:06:06 [atsushi]
agenda: https://github.com/immersive-web/administrivia/blob/main/meetings/cg/2020-11-10-Immersive_Web_Community_Group_Teleconference-agenda.md
20:06:22 [atsushi]
previous minutes: https://www.w3.org/2020/09/22-immersive-web-minutes.html
20:06:26 [atsushi]
zakim, clear agenda
20:06:26 [Zakim]
agenda cleared
20:06:33 [atsushi]
agenda+ administrivia#142 Next Telecon is Performance Telecon
20:06:43 [atsushi]
chair: Ada
20:06:50 [atsushi]
rrsagent, please make log public
20:06:55 [atsushi]
rrsagent, publish minutes v2
20:06:55 [RRSAgent]
I have made the request to generate https://www.w3.org/2020/11/10-immersive-web-minutes.html atsushi
20:08:58 [cabanier]
cabanier has joined #immersive-web
20:09:03 [cabanier]
present+
20:09:12 [ada]
scribenick: ada
20:09:14 [yonet]
yonet has joined #immersive-web
20:09:25 [yonet]
present+
20:09:41 [yonet]
chair: Aysegul Yonet
20:10:28 [ada]
bajones: we're going to carry on from TPAC
20:11:14 [ada]
The 7th point is about stabilisation planes, which are hardware specific; do we have someone from HoloLens or Magic Leap to talk about whether they still need this support?
20:11:20 [jgilbert]
jgilbert has joined #immersive-web
20:11:26 [ada]
everyone: silence
20:11:35 [ada]
yonet: I don't think they are here
20:11:44 [ada]
bajones: we'll ignore that issue then
20:11:52 [bajones]
bajones has joined #Immersive-web
20:11:54 [ada]
... so this will mostly be a discussion about timing measurement
20:11:57 [bajones]
present+
20:12:45 [yonet]
https://github.com/immersive-web/performance-improvements/issues
20:12:53 [bajones]
https://github.com/immersive-web/performance-improvements/issues/5
20:13:11 [ada]
This issue is about finding a meaningful metric that is accessible across all platforms and that developers will be able to make use of
20:13:29 [klausw]
q+
20:13:37 [ada]
we do have something that comes close, which is Klaus's viewport stats from the viewport scaling
20:14:23 [ada]
bajones: this is probably less like producing a number; instead there will be some heuristic which reports to you that if you made your viewport smaller you are likely to reach the perf target
20:14:45 [ada]
... maybe Klaus can tell us whether this is performing well for anyone, but it does seem to be a good approach.
20:14:54 [atsushi]
i|bajones: we're going to carry on from TPAC|topic: Performance improvements https://github.com/immersive-web/performance-improvements/issues
20:15:02 [ada]
... to take a piece of the API and treat it as a knob which can be turned up or down
20:15:13 [yonet]
ack klausw
20:15:47 [ada]
klausw: this is currently being used for model-viewer, where it is working well. It's mainly for the use case where, as you get close to a complex model, it automatically drops resolution to maintain framerate
20:16:29 [ada]
... I've been using GPU utilisation as a guide: <1 is good since you are not using all of the GPU, ==1 means you are saturating the GPU. This is a metric that could be useful if other platforms are willing to expose it.
20:16:48 [ada]
... having a rough estimate for whether you are GPU bound sounds useful.
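
A minimal sketch of the dynamic viewport scaling pattern described above, assuming the API proposed in webxr#1091 (recommendedViewportScale / requestViewportScale), WebXR type definitions such as @types/webxr, and that gl, glLayer and refSpace come from session setup:

  declare const gl: WebGL2RenderingContext;
  declare const glLayer: XRWebGLLayer;
  declare const refSpace: XRReferenceSpace;

  function onXRFrame(time: DOMHighResTimeStamp, frame: XRFrame): void {
    frame.session.requestAnimationFrame(onXRFrame);
    const pose = frame.getViewerPose(refSpace);
    if (!pose) return;
    for (const view of pose.views) {
      // The UA's recommendation may be null if the platform has no opinion;
      // requesting the scale before getViewport() makes it apply this frame.
      view.requestViewportScale(view.recommendedViewportScale ?? 1.0);
      const vp = glLayer.getViewport(view)!;
      gl.viewport(vp.x, vp.y, vp.width, vp.height);
      // ... draw the scene for this view ...
    }
  }
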
20:16:50 [ada]
q+
20:17:35 [ada]
bajones: my assessment of this issue is that I am not seeing a tonne of numbers we could expose to help developers, but I think approaches like the viewport scaling are a good pattern to follow.
20:17:41 [cabanier]
q+
20:17:54 [klausw]
q+
20:18:46 [atsushi]
rrsagent, publish minutes v2
20:18:46 [RRSAgent]
I have made the request to generate https://www.w3.org/2020/11/10-immersive-web-minutes.html atsushi
20:18:58 [yonet]
ack ada
20:19:52 [bajones]
https://github.com/immersive-web/performance-improvements/issues/6
20:20:05 [ada]
ada: a GPU saturation number is potentially very fingerprinting sensitive
20:20:17 [yonet]
ack cabanier
20:20:20 [ada]
cabanier: there is also the foveation parameter
20:20:37 [ada]
... which will let the renderer spend less time around the periphery.
20:20:57 [ada]
... shouldn't this be a useful thing for all WebGL content, not just WebXR?
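
A minimal sketch of that foveation knob, assuming the fixedFoveation attribute (0 = off, 1 = maximum foveation) proposed for WebXR layers; the cast is there because older type definitions may not declare it:

  declare const glLayer: XRWebGLLayer;

  function setFoveation(gpuBound: boolean): void {
    // Spend less GPU time on the periphery when the app appears GPU bound,
    // back off again when there is headroom.
    (glLayer as XRWebGLLayer & { fixedFoveation?: number }).fixedFoveation =
      gpuBound ? 1.0 : 0.25;
  }
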
20:21:16 [ada]
bajones: I agree; if it's a platform-wide metric, the platform should probably be the one to do it
20:21:29 [ada]
q+ to see if we can find out why they haven't been done already
20:21:52 [yonet]
ack klausw
20:22:35 [ada]
klausw: perhaps temporal granularity will make it not as useful
20:23:12 [jgilbert]
q+
20:23:23 [ada]
... some apps will want a way to set a preference for how they want the app to trade off resolution vs responsiveness.
20:23:24 [yonet]
ack ada
20:23:24 [Zakim]
ada, you wanted to see if we can find out why they haven't been done already
20:24:18 [bajones]
Ada can you mute again?
20:24:18 [yonet]
ack jgilbert
20:24:31 [ada]
q+ we should ask WebGL people why perf metrics haven't been implemented yet, what are the blockers.
20:24:49 [ada]
ada: we should ask the WebGL people why perf metrics haven't been implemented yet, and what the blockers are.
20:25:43 [ada]
jgilbert: In WebGL we've been expecting users to monitor their render times and compare them to the rAF timing to do perf work themselves.
20:26:27 [ada]
... knobs are handy for people with simple experiences, but for advanced ones we should provide numerical feedback to be more useful
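
A minimal sketch of the self-monitoring described here, timing the work inside the rAF callback against the observed frame-to-frame interval; only CPU/JS time is visible this way, GPU time needs timer-query extensions or UA help:

  let lastFrameStart = 0;

  function onFrame(now: DOMHighResTimeStamp): void {
    requestAnimationFrame(onFrame);
    const frameInterval = lastFrameStart ? now - lastFrameStart : 0;
    lastFrameStart = now;

    const jsStart = performance.now();
    // ... update state and issue draw calls ...
    const jsTime = performance.now() - jsStart;

    if (frameInterval > 0 && jsTime > 0.8 * frameInterval) {
      // Clearly CPU bound: reduce scene complexity, resolution, etc.
    }
  }
  requestAnimationFrame(onFrame);
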
20:28:11 [klausw]
q+ to say a device might think it's perfectly hitting 60fps though the actual framerate is 120fps
20:28:12 [ada]
bajones: it's important to do these things in a privacy-preserving manner. We tell users to measure their rAF timings to see if they are missing them. For VR devices users may not know what timings they need to hit, since different devices run at different frame rates, sometimes even within the same device.
20:28:30 [ada]
... it would be great to tell users when they are reaching and when they are missing the framerate
20:28:50 [ada]
... we could tell them, "you need to reach 72Hz"
20:29:32 [ada]
... I do recall the only consistent number we could get is how many frames the compositor missed, i.e. "you only supplied 2 of the last 5 frames"
20:29:48 [ada]
... this feels like something that could be surfaced without much privacy concern
20:30:12 [ada]
... since you can get there anyway with hacks
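
A minimal sketch of the kind of hack alluded to here, a hypothetical helper (not a spec'd counter) that estimates the native frame interval from the shortest observed deltas and counts a miss when a delta is well above it:

  let prevTime = 0;
  let estimatedInterval = Infinity;
  let missedFrames = 0;
  let totalFrames = 0;

  // Call at the top of the XR rAF callback with its timestamp, then compare
  // missedFrames / totalFrames over a sliding window.
  function trackFrameTiming(time: DOMHighResTimeStamp): void {
    if (prevTime > 0) {
      const delta = time - prevTime;
      estimatedInterval = Math.min(estimatedInterval, delta);
      totalFrames++;
      if (delta > 1.5 * estimatedInterval) missedFrames++;
    }
    prevTime = time;
  }
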
20:30:28 [jgilbert]
q+
20:31:22 [yonet]
ack ada
20:31:39 [yonet]
klausw+
20:31:48 [yonet]
ack klausw
20:31:48 [Zakim]
klausw, you wanted to say a device might think it's perfectly hitting 60fps though the actual framerate is 120fps
20:32:05 [ada]
klausw: I like the idea of the dropped-frame counter, but then we need a signal to know when to increase the complexity again
20:32:24 [cabanier]
q+
20:32:30 [ada]
... when the device is being underutilised.
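
A minimal sketch of that two-sided signal: back off when frames are missed and only ramp complexity up again after a sustained run of clean frames. missedLastFrame is a hypothetical input, e.g. fed from a dropped-frame counter if the UA exposed one:

  let cleanStreak = 0;
  let quality = 1.0; // 0..1 knob fed into resolution / scene complexity

  function adaptQuality(missedLastFrame: boolean): number {
    if (missedLastFrame) {
      cleanStreak = 0;
      quality = Math.max(0.5, quality - 0.1);   // drop quickly on misses
    } else if (++cleanStreak >= 90) {           // roughly one second at 90Hz
      cleanStreak = 0;
      quality = Math.min(1.0, quality + 0.05);  // recover slowly when clean
    }
    return quality;
  }
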
20:32:37 [yonet]
ack jeff
20:32:53 [yonet]
ack jgilbert
20:33:09 [ada]
jgilbert: I wanted to mention that for missed frame times with variable refresh rate, frames don't get missed
20:33:38 [ada]
q+ to ask if that makes sense for headsets
20:33:51 [ada]
q-
20:34:22 [ada]
bajones: you don't get variable refresh rate in XR. They might try at 90Hz but if they fail they will lock at 60Hz
20:35:03 [yonet]
ack cabanier
20:35:31 [ada]
cabanier: the Magic Leap is a variable-rate device since it will invent frames using the depth buffer
20:36:47 [ada]
... if it is swap-chain based and the frame is still resolving, complex things may happen that give poor performance
20:37:33 [ada]
bajones: we want to see if we should do this here or push for something for the whole platform
20:37:50 [jgilbert]
q+
20:38:40 [ada]
... we could do something faster here, and there is a higher privacy barrier for using it so it has fewer privacy concerns, but a platform-wide one could benefit everyone. Even then, though, WebXR content may need a specific use case.
20:39:02 [ada]
... we could build something here and, if it works, present it to the wider web; I am not sure of the best approach for doing that.
20:39:17 [ada]
... for timing info it's probably best exposed on the XR frame.
20:39:45 [yonet]
ack jgilbert
20:40:30 [ada]
jgilbert: Brandon has a good point that we need to separate desktop vs headset. An alternative would be getting the target frame time during the rAF callback.
20:41:11 [ada]
bajones: I believe requestIdleCallback gives you idle time
20:41:24 [ada]
... something like this could be equivalent
20:41:44 [klausw]
q+
20:42:04 [yonet]
ack klausw
20:42:12 [ada]
... it would be a fine line to walk, i.e. "you have 8ms" doesn't tell you the frame rate but would be related.
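
A minimal sketch of the analogue being drawn: on the 2D page, requestIdleCallback already hands its callback a deadline with the remaining headroom, and the idea is that an XR frame could carry a similar per-frame time budget rather than a raw refresh rate:

  requestIdleCallback((deadline: IdleDeadline) => {
    // deadline.timeRemaining() is the "you have N ms" signal for this idle period.
    if (deadline.timeRemaining() > 2) {
      // Safe to do a slice of deferrable work (asset decoding, cache warming, ...).
    }
  });
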
20:42:23 [jgilbert]
q+
20:42:38 [ada]
klausw: we also need to make it clear to developers that this includes render time and JS time
20:42:53 [yonet]
ack jgilbert
20:43:06 [ada]
jgilbert: we don't even know those numbers in the user agents today
20:43:44 [ada]
... telling the user what the framerate is might give the best bang for the buck
20:44:12 [yonet]
https://github.com/immersive-web/performance-improvements/issues/7
20:44:29 [ada]
ada: topic stabilisation planes
20:45:15 [RafaelCintron]
RafaelCintron has joined #immersive-web
20:45:21 [RafaelCintron]
q+
20:45:26 [ada]
bajones: this was an old issue and priorities may have changed; is this something we still want to think about?
20:46:09 [yonet]
ack RafaelCintron
20:46:22 [ada]
Lachlan: I'm not sure whether it is in OpenXR
20:46:56 [alexturn]
alexturn has joined #immersive-web
20:46:59 [ada]
RafaelCintron: the one we have been encouraged to use is the depth buffer stabilisation
20:47:03 [alexturn]
present+
20:47:07 [alexturn]
q+
20:47:38 [yonet]
ack alexturn
20:47:59 [ada]
bajones: since we have a preferred way of capturing depth information, perhaps we should shut this issue down and refer to that.
20:48:52 [ada]
alexturn: for the HoloLens 2, submitting a depth frame is the recommended way to do it; for developers, writing a homemade version of this is practically impossible.
20:49:13 [ada]
So definitely just submitting the depth buffer is the way to go.
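
A minimal sketch of "just submit the depth buffer" in WebXR: create the layer so the compositor is allowed to read the depth values, and keep depthNear / depthFar consistent with the projection used for rendering (gl and session assumed from setup; values are illustrative):

  declare const gl: WebGL2RenderingContext;
  declare const session: XRSession;

  const layer = new XRWebGLLayer(session, gl, {
    depth: true,              // allocate a depth buffer for the layer
    ignoreDepthValues: false, // allow the compositor to use it for reprojection
  });
  session.updateRenderState({
    baseLayer: layer,
    depthNear: 0.1,
    depthFar: 100.0,
  });
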
20:50:28 [klausw]
q+
20:51:22 [ada]
cabanier: people are running into VM issues where WebXR is creating lots of garbage and causing problems with garbage collection
20:51:57 [ada]
bajones: can you publish the graphs of those objects so we can look at them as a group?
20:52:32 [alexturn]
q+
20:53:20 [ada]
... it would be great to take them back to the interface wizards who tell us to ignore GC and show how it is causing us problems, so we can move to more GC-friendly APIs in the future. Manish's work on alternative APIs is a good example. Or we can take it back to the browser developers to see if they can fix it.
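
A minimal sketch of the app-side half of this: reuse preallocated scratch storage across frames instead of allocating new arrays in every callback, so the application does not add to the short-lived garbage the per-frame spec objects already create (refSpace assumed from setup):

  declare const refSpace: XRReferenceSpace;

  const scratchMatrix = new Float32Array(16); // reused every frame

  function onXRFrame(time: DOMHighResTimeStamp, frame: XRFrame): void {
    frame.session.requestAnimationFrame(onXRFrame);
    const pose = frame.getViewerPose(refSpace);
    if (!pose) return;
    for (const view of pose.views) {
      // Copy into the long-lived buffer and let the engine collect the
      // spec-created array cheaply (ideally from gen 0).
      scratchMatrix.set(view.projectionMatrix);
      // ... use scratchMatrix to set uniforms, etc. ...
    }
  }
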
20:53:30 [yonet]
ack klausw
20:53:49 [ada]
jgilbert: the audio groups have been looking at these kinds of issues.
20:53:51 [klausw]
dynamic viewport scaling: https://github.com/immersive-web/webxr/issues/1091
20:54:17 [yonet]
ack alexturn
20:54:17 [ada]
klausw: does anyone have any intent to implement dynamic viewport scaling?
20:55:12 [ada]
alexturn: for the GC stuff I would be curious to see the profiles, to see if they come from medium-lived objects which last a while, since most objects should last exactly one frame and then be cleaned up
20:55:35 [ada]
ada: what is gen zero?
20:56:23 [ada]
the GC is designed to efficiently clear stuff which is short-lived and stays in gen 0; gen 1 is for longer-lasting stuff.
20:56:41 [ada]
s/^/alexturn:/
20:57:09 [ada]
bajones: if gen 0 isn't pulling its weight here we need to know why.
21:15:28 [atsushi]
rrsagent, publish minutes v2
21:15:28 [RRSAgent]
I have made the request to generate https://www.w3.org/2020/11/10-immersive-web-minutes.html atsushi
22:04:30 [atsushi]
previous meeting: https://www.w3.org/2020/09/22-immersive-web-minutes.html
22:04:37 [atsushi]
s|s/^/alexturn:/||
22:04:48 [atsushi]
s/the GC is designed/alexturn: the GC is designed/
22:05:14 [atsushi]
s/The 7th point is about/... The 7th point is about/
22:06:06 [atsushi]
i|This issue is about finding a meaningful|topic: Provide GPU timing info to content https://github.com/immersive-web/performance-improvements/issues/5
22:06:10 [atsushi]
rrsagent, publish minutes v2
22:06:10 [RRSAgent]
I have made the request to generate https://www.w3.org/2020/11/10-immersive-web-minutes.html atsushi
22:07:30 [atsushi]
s/This issue is about finding a meaningful/bajones: This issue is about finding a meaningful
22:07:59 [atsushi]
s/we do have something that comes close/... we do have something that comes close/
22:11:47 [atsushi]
i/bajones: This issue is about finding a meaningful/topic: Need a way to determine whether rendering is falling below target frame rate #6
22:11:49 [atsushi]
rrsagent, publish minutes v2
22:11:49 [RRSAgent]
I have made the request to generate https://www.w3.org/2020/11/10-immersive-web-minutes.html atsushi
22:12:19 [atsushi]
s|topic: Provide GPU timing info to content https://github.com/immersive-web/performance-improvements/issues/5||
22:12:50 [atsushi]
s|topic: Performance improvements https://github.com/immersive-web/performance-improvements/issues|topic: Provide GPU timing info to content #5|
22:12:52 [atsushi]
rrsagent, publish minutes v2
22:12:52 [RRSAgent]
I have made the request to generate https://www.w3.org/2020/11/10-immersive-web-minutes.html atsushi
22:14:27 [atsushi]
i/ada: topic stabilisation planes/topic: Add support for an optional stabilization plane #7
22:14:29 [atsushi]
rrsagent, publish minutes v2
22:14:29 [RRSAgent]
I have made the request to generate https://www.w3.org/2020/11/10-immersive-web-minutes.html atsushi
22:15:17 [atsushi]
s/So definitely just submitting/... So definitely just submitting/
22:15:19 [atsushi]
rrsagent, publish minutes v2
22:15:19 [RRSAgent]
I have made the request to generate https://www.w3.org/2020/11/10-immersive-web-minutes.html atsushi
22:16:12 [atsushi]
(fin