IRC log of webmachinelearning on 2024-04-18

Timestamps are in UTC.

13:57:46 [RRSAgent]
RRSAgent has joined #webmachinelearning
13:57:50 [RRSAgent]
logging to https://www.w3.org/2024/04/18-webmachinelearning-irc
13:57:50 [Zakim]
RRSAgent, make logs Public
13:57:51 [anssik]
Meeting: WebML WG Teleconference – 18 April 2024
13:57:51 [Zakim]
please title this meeting ("meeting: ..."), anssik
13:57:51 [RafaelCintron]
RafaelCintron has joined #webmachinelearning
13:57:52 [anssik]
Chair: Anssi
13:57:55 [anssik]
Agenda: https://github.com/webmachinelearning/meetings/blob/main/telcons/2024-04-18-wg-agenda.md
13:57:56 [jsbell]
jsbell has joined #webmachinelearning
13:57:58 [anssik]
Scribe: Anssi
13:58:02 [anssik]
scribeNick: anssik
13:58:08 [anssik]
gb, this is webmachinelearning/webnn
13:58:08 [gb]
anssik, OK.
13:58:14 [anssik]
Present+ Anssi_Kostiainen
13:58:23 [anssik]
Present+ Rafael_Cintron
13:58:32 [anssik]
Present+ Zoltan_Kis
13:58:40 [anssik]
Present+ Joshua_Bell
13:59:03 [McCool]
McCool has joined #webmachinelearning
13:59:18 [anssik]
Present+ Michael_McCool
13:59:34 [anssik]
RRSAgent, draft minutes
13:59:35 [RRSAgent]
I have made the request to generate https://www.w3.org/2024/04/18-webmachinelearning-minutes.html anssik
14:00:20 [anssik]
Present+ Joshua_Lochner
14:00:35 [anssik]
Present+ Dwayne_Robinson
14:01:16 [anssik]
Present+ Austin_Sullivan
14:01:21 [Ningxin_Hu]
Ningxin_Hu has joined #webmachinelearning
14:01:27 [asully]
asully has joined #webmachinelearning
14:01:51 [anssik]
Present+ Deepti_Gandluri
14:02:05 [anssik]
Present+ Ningxin_Hu
14:02:27 [Joshua_Lochner]
Joshua_Lochner has joined #webmachinelearning
14:02:45 [anssik]
anssik: please join me in welcoming our latest WG participant!
14:02:57 [anssik]
... - Kenji Baheux from Google has contributed to the emerging Hybrid AI effort and joins in an official capacity
14:03:30 [anssik]
Topic: Hybrid AI exploration update
14:03:45 [anssik]
anssik: the initial Hybrid AI exploration proposal https://github.com/webmachinelearning/proposals/issues/5 received interest from multiple folks (thanks!)
14:03:45 [gb]
https://github.com/webmachinelearning/proposals/issues/5 -> Issue 5 Hybrid AI Exploration (by grgustaf)
14:03:59 [anssik]
anssik: to allow for a more structured discussion, I'd propose we create a "webmachinelearning/hybrid-ai" repo owned by the WebML Community Group
14:04:22 [anssik]
... this allows us to split the discussion into multiple issues, so interested folks can discuss important topics such as privacy implications of shared caches in dedicated issues
14:04:55 [anssik]
... The WebML CG (!= WG) is chartered to produce "Non-Normative Reports [...] that are not Specifications, for instance use cases, requirements, or white papers" so the Hybrid AI discussion would fall within that scope, that includes explainer-style docs
14:05:14 [anssik]
... participation across the WG and CG overlaps well; active contributors are encouraged to join both groups
14:05:33 [anssik]
... possible new spec incubations informed by the Hybrid AI discussion will require us to recharter the CG
14:05:41 [anssik]
... we will get to that if/when timely
14:05:51 [anssik]
-> Web Machine Learning Community Group Charter https://webmachinelearning.github.io/charter/
14:05:56 [anssik]
-> Instructions for joining the CG (and/or the WG) https://webmachinelearning.github.io/community/#join
14:06:06 [anssik]
anssik: any questions or comments?
14:06:37 [anssik]
Michael: splitting discussion into multiple issues sounds good to me
14:07:11 [anssik]
Topic: WebNN W3C Candidate Recommendation Snapshot published
14:07:21 [anssik]
anssik: A new WebNN Candidate Recommendation Snapshot was published 11 April 2024.
14:07:25 [anssik]
-> Web Neural Network API W3C Candidate Recommendation Snapshot, 11 April 2024 https://www.w3.org/TR/2024/CR-webnn-20240411/
14:07:39 [anssik]
anssik: Congrats to the WG for this milestone!
14:07:46 [anssik]
... this Candidate Recommendation of WebNN API was signal-boosted by W3C's news and social channels and was picked up by some established newsletters such as JavaScript Weekly. Feel free to spread the word.
14:07:56 [anssik]
-> W3C News: Updated Candidate Recommendation: Web Neural Network API https://www.w3.org/news/2024/updated-candidate-recommendation-web-neural-network-api/
14:08:21 [anssik]
anssik: Since the initial Candidate Recommendation Snapshot published 30 March 2023 the group gathered further implementation experience and added new ops and data types needed for well-known transformers to support generative AI use cases
14:08:28 [anssik]
... I'd like to thank the entire group for their contributions
14:08:50 [anssik]
... this publication improved the specification significantly from the initial 2023 CR and benefited from an expanded contributor base and advanced implementation efforts
14:09:18 [anssik]
... our work continues toward the next milestone, and we're currently addressing open issues faster than we open new issues i.e. our bug count is trending downwards, yay!
14:09:45 [anssik]
Topic: NPU support discussion (cont'd)
14:09:50 [anssik]
anssik: issue #623
14:09:51 [gb]
https://github.com/webmachinelearning/webnn/issues/623 -> Issue 623 WebNN should support NPU and QDQ operations (by wchao1115) [v2] [opset] [feature request]
14:10:10 [anssik]
anssik: I'd like to follow up on the NPU support discussion from our last meeting
14:10:41 [McCool]
q+
14:10:52 [anssik]
... three things: 1) NPU device type, 2) fallback device concept, 3) ops for quantized models
14:10:58 [anssik]
q?
14:11:02 [anssik]
ack McCool
14:11:26 [anssik]
Michael: quantization representation may be a separate issue; it can be used outside NPUs too
14:11:58 [anssik]
anssik: quantization was first brought up in #128
14:11:59 [gb]
https://github.com/webmachinelearning/webnn/issues/128 -> Issue 128 WebNN should support int8 quantized models (by wchao1115) [v2] [opset] [feature request]
14:13:07 [anssik]
Dwayne: agree with the 3 major topics, and also that quantization support applies to other device types too
14:13:27 [anssik]
... NPU device type is in Chromium and we can test it, it doesn't have a fallback yet
14:13:53 [anssik]
anssik: what is the failure path?
14:14:00 [anssik]
Dwayne: at context creation time
14:14:41 [anssik]
anssik: is the Chromium implementation blocked on some spec discussions?
14:15:08 [anssik]
Dwayne: not blocked on the spec at the moment, but insights from other platforms would help, e.g. CoreML
14:15:50 [asully]
q+
14:16:06 [anssik]
ack asully
14:16:25 [anssik]
Austin: on CoreML there's always a fallback, CPU is always included no matter what
14:17:01 [anssik]
... WebNN API fails if there's no NPU present, and if there's an NPU present it tries to create a context and will fail at build
14:17:17 [anssik]
... without a fallback or polyfill, pushing the failure forward would be helpful
14:17:46 [anssik]
... another relevant thing re CoreML: adding conv2dint and integer-focused ops; on CoreML everything is fp16
14:18:25 [anssik]
... accelerator support for data types varies across systems; conv2dint on Mac needs to be emulated on the CPU in userspace in the Chromium implementation
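Austin's point about pushing the failure forward can be illustrated with the fallback pattern an application could implement itself today. This is a sketch, not the WebNN API: `createContext` below is a local stub standing in for `navigator.ml.createContext()`, simulating a machine without an NPU.

```javascript
// Stub standing in for navigator.ml.createContext(): rejects when the
// requested device is unavailable (here, simulating a machine with no NPU).
async function createContext(options) {
  if (options.deviceType === 'npu') {
    throw new Error('NPU not present');
  }
  return { deviceType: options.deviceType };
}

// App-level fallback: try each device type in order of preference and
// return the first context that can be created.
async function createContextWithFallback(deviceTypes) {
  for (const deviceType of deviceTypes) {
    try {
      return await createContext({ deviceType });
    } catch (e) {
      // This device is unavailable; try the next, less preferred one.
    }
  }
  throw new Error('no usable device');
}
```

With the stub above, `createContextWithFallback(['npu', 'gpu', 'cpu'])` resolves to a GPU context, surfacing the NPU's absence only as a silent fallback rather than a hard failure.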
14:18:55 [anssik]
q?
14:20:20 [anssik]
anssik: anything around ops for quantized models?
14:21:07 [anssik]
Michael: should consider e.g. 4-bit adapters in this context
14:21:39 [anssik]
Dwayne: is this a block-based compression for weights?
14:21:51 [anssik]
Michael: a moving target what to support, noting overlap with adapters
14:22:09 [anssik]
q?
14:22:36 [anssik]
q?
14:23:12 [anssik]
Topic: Open issues and PRs
14:23:19 [jsbell]
q+
14:23:19 [anssik]
anssik: As usual, we'll discuss open issues and review PRs based on your feedback and progress:
14:23:25 [anssik]
-> All open issues https://github.com/webmachinelearning/webnn/issues
14:23:29 [anssik]
-> All open pull requests https://github.com/webmachinelearning/webnn/pulls
14:23:39 [anssik]
Subtopic: Debrief on PRs merged recently
14:23:55 [anssik]
anssik: first, a large number of issues I put on the agenda a week ago were addressed in the meantime. I'm impressed.
14:24:09 [anssik]
... massive Thank You to Josh, Ningxin, Dwayne, Austin, and others who worked on these!
14:24:16 [anssik]
... the following were addressed:
14:24:24 [anssik]
... - issue #634 fixed by PR #639
14:24:25 [gb]
https://github.com/webmachinelearning/webnn/issues/634 -> CLOSED Issue 634 `MLContext.compute()` explicitly rejects promises where the sub-septs may currently throw (by huningxin) [bug]
14:24:25 [gb]
https://github.com/webmachinelearning/webnn/pull/639 -> MERGED Pull Request 639 BugFix: `compute()` explicitly rejects a promise if buffer transferring fails (by huningxin)
14:24:34 [anssik]
... - issue #610 fixed by PR #641
14:24:35 [gb]
https://github.com/webmachinelearning/webnn/pull/641 -> MERGED Pull Request 641 Introduce "valid dimension", used as needed when calculating operand shapes (by inexorabletash)
14:24:35 [gb]
https://github.com/webmachinelearning/webnn/issues/610 -> CLOSED Issue 610 Need clarify scale factor for resample2d (by BruceDai) [bug]
14:24:42 [anssik]
... - issue #602 fixed by PR #622
14:24:43 [gb]
https://github.com/webmachinelearning/webnn/pull/622 -> MERGED Pull Request 622 Revise graph resource validation (by inexorabletash)
14:24:43 [gb]
https://github.com/webmachinelearning/webnn/issues/602 -> CLOSED Issue 602 Is "validate graph resources" backwards? (by inexorabletash) [question]
14:24:56 [anssik]
... - issues #484 #486 fixed by PR #642
14:24:57 [gb]
https://github.com/webmachinelearning/webnn/pull/642 -> MERGED Pull Request 642 gather(): Address indices validation and other algorithm nits (by inexorabletash)
14:24:57 [gb]
https://github.com/webmachinelearning/webnn/issues/486 -> CLOSED Issue 486 Add "implementation consideration" about how out-of-bound indices of Gather should be handled (by huningxin) [operator specific]
14:24:57 [gb]
https://github.com/webmachinelearning/webnn/issues/484 -> CLOSED Issue 484 Should Gather's indices data type be integers and support negative value? (by huningxin) [operator specific]
14:25:04 [anssik]
... - issue #209 fixed by PR #637
14:25:04 [gb]
https://github.com/webmachinelearning/webnn/pull/637 -> MERGED Pull Request 637 Decompositions for reduceLogSum, reduceLogSumExp, and reduceSumSquare (by inexorabletash)
14:25:04 [gb]
https://github.com/webmachinelearning/webnn/issues/209 -> CLOSED Issue 209 reduceLogSum, reduceLogSumExp, reduceSumSquare are not supported on OV/NNAPI (by mingmingtasd) [editorial] [operator specific]
14:25:16 [anssik]
... - issue #630 fixed by PR #631
14:25:16 [gb]
https://github.com/webmachinelearning/webnn/pull/631 -> MERGED Pull Request 631 Validate restriction of output padding in convTranspose2d (by inexorabletash)
14:25:16 [gb]
https://github.com/webmachinelearning/webnn/issues/630 -> CLOSED Issue 630 Need to add the value restriction of output padding in convTranspose2d (by mei1127) [operator specific]
14:25:20 [anssik]
... an agenda diff for these: https://github.com/webmachinelearning/meetings/commit/904edb388f2c091ee1f80fd8cbac6af54ea0a1eb
14:25:40 [anssik]
... and a few more PRs that were not on the agenda were also addressed after our last meeting:
14:25:53 [anssik]
... - issue #643 partially fixed by PR #283 (PR is pre-work)
14:25:53 [gb]
https://github.com/webmachinelearning/webnn/pull/643 -> MERGED Pull Request 643 Fix/simplify some validation steps (by inexorabletash)
14:25:53 [gb]
https://github.com/webmachinelearning/webnn/issues/283 -> Issue 283 Specify the operand data type constraints of operation (by huningxin) [question]
14:26:00 [anssik]
... - issue #615 fixed by PR #632
14:26:01 [gb]
https://github.com/webmachinelearning/webnn/pull/632 -> MERGED Pull Request 632 add a note for empty input (by philloooo)
14:26:01 [gb]
https://github.com/webmachinelearning/webnn/issues/615 -> CLOSED Issue 615 Graph with no input (by philloooo) [question]
14:26:09 [anssik]
... - issue #187 fixed by #638
14:26:10 [gb]
https://github.com/webmachinelearning/webnn/pull/638 -> MERGED Pull Request 638 Editorial: Make "generically emulated" text a macro, update wording (by inexorabletash)
14:26:10 [gb]
https://github.com/webmachinelearning/webnn/issues/187 -> CLOSED Issue 187 BatchNormalization should be an optional operation (by pyu10055) [opset]
14:26:20 [anssik]
... - PR #640 merged to align with newly added WebIDL Float16Array
14:26:21 [gb]
https://github.com/webmachinelearning/webnn/pull/640 -> MERGED Pull Request 640 Float16Array has landed in Web IDL - remove workarounds (by inexorabletash)
14:27:53 [anssik]
Joshua: PRs to highlight: #637 adds decompositions, #638 adds a macro for the "generically emulated" text, #642 makes changes to gather
14:28:28 [anssik]
... common to these is the text associated with decompositions, worded as hints to implementers, if a backend does not support clamping then use these decompositions as a hint
14:28:54 [anssik]
... one thing that stood out going through all these PRs: eventually we may run into cases where these can't be handled in userspace
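As a concrete reading of those decomposition hints: if a backend lacks a native clamp, the spec suggests it can be emulated with min/max. The scalar sketch below is only illustrative; the spec's decomposition hints operate on tensors, not scalars.

```javascript
// Scalar sketch of the decomposition hint for clamp: a backend without a
// native clamp can compute min(max(x, minValue), maxValue) instead.
function clamp(x, minValue = -Infinity, maxValue = Infinity) {
  return Math.min(Math.max(x, minValue), maxValue);
}
```

The default bounds mirror clamp's behavior when MLClampOptions omits minValue or maxValue: the input passes through unchanged.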
14:29:10 [Ningxin_Hu]
gelu op has also been added: https://github.com/webmachinelearning/webnn/pull/628
14:29:11 [gb]
https://github.com/webmachinelearning/webnn/pull/628 -> MERGED Pull Request 628 Define Gelu operation (by mingmingtasd)
14:31:11 [anssik]
Subtopic: [bug] Synchronously validate input operands/activations
14:31:16 [anssik]
anssik: issue #572
14:31:17 [gb]
https://github.com/webmachinelearning/webnn/issues/572 -> Issue 572 Synchronously validate input operands/activations (by inexorabletash) [bug] [question]
14:31:32 [anssik]
... gentle nudge, I believe feedback and PRs are still welcome for the proposed remaining work
14:31:41 [anssik]
... Josh, anything you want to add?
14:32:02 [anssik]
jsbell: no change since last time
14:32:19 [anssik]
Subtopic: [bug] WebIDL definition for constant(start, end, step, type) is missing
14:32:30 [anssik]
anssik: issue #571 (related to #492 discussed later)
14:32:32 [gb]
https://github.com/webmachinelearning/webnn/issues/571 -> Issue 571 WebIDL definition for constant(start, end, step, type) is missing (by inexorabletash) [bug] [question] [operator specific]
14:32:32 [gb]
https://github.com/webmachinelearning/webnn/issues/492 -> Issue 492 Constant sequential filling operation needs output shape parameter (by fdwr) [feature request] [operator specific]
14:32:40 [anssik]
... a proposal for adding the missing Web IDL definition for constant
14:32:52 [anssik]
... Ningxin suggested we consider this together with int64 (bigint), see issue https://github.com/webmachinelearning/webnn/issues/492#issuecomment-1926419539
14:32:53 [gb]
https://github.com/webmachinelearning/webnn/issues/492 -> Issue 492 Constant sequential filling operation needs output shape parameter (by fdwr) [feature request] [operator specific]
14:33:02 [anssik]
... so let us visit #492 for a second:
14:33:18 [anssik]
... Josh comments bigint use in Chromium is limited and binding codegen is not yet handling that
14:33:26 [anssik]
... also notes Web IDL spec has a warning for union of bigint and a numeric type
14:33:34 [anssik]
-> Web IDL warning: a union of bigint and a numeric type https://webidl.spec.whatwg.org/#limit-bigint-numeric-unions
14:33:46 [anssik]
... Dwayne opened a new Web IDL issue https://github.com/whatwg/webidl/issues/1388 and Web IDL spec editor was convinced the proposal to use a union of bigint and a numeric type "seems reasonable"
14:33:46 [gb]
https://github.com/whatwg/webidl/issues/1388 -> Issue 1388 Intent to use BigInt/numeric union in WebNN (by fdwr)
14:34:04 [anssik]
... are we going to miss something if we merge this issue #571 into issue #492 ?
14:34:37 [jsbell]
Agreed, just merge them. I'll make it a dupe
14:34:53 [anssik]
Dwayne: I don't think we'll miss anything if we close this
14:35:18 [Ningxin_Hu]
+1 to merge them
14:35:31 [anssik]
anssik: OK to close this when useful information is transferred from this issue to #492
14:35:33 [anssik]
q?
14:35:37 [anssik]
ack jsbell
14:35:39 [jsbell]
q-
14:36:06 [anssik]
Subtopic: [question] Allow no-op graphs?
14:36:11 [anssik]
anssik: issue #614
14:36:12 [gb]
https://github.com/webmachinelearning/webnn/issues/614 -> Issue 614 Allow no-op graphs? (by inexorabletash) [question]
14:36:21 [anssik]
... on our last call Ningxin suggested we track this as part of MLBuffer proposal
14:36:27 [anssik]
-> Discussion from our last call https://www.w3.org/2024/04/04-webmachinelearning-minutes.html#t07
14:36:33 [anssik]
anssik: any new information to add?
14:36:46 [anssik]
... Austin you commented this recently in context of fillSequence https://github.com/webmachinelearning/webnn/issues/492#issuecomment-2058063741
14:36:47 [gb]
https://github.com/webmachinelearning/webnn/issues/492 -> Issue 492 Constant sequential filling operation needs output shape parameter (by fdwr) [feature request] [operator specific]
14:36:54 [anssik]
... do you want to speak to your question related to fillSequence rationale?
14:37:54 [anssik]
Austin: the context is I was looking at fillSequence and realized we can't imply the end; MLBuffer could be passed as a constant so this may not be required; my understanding of MLBuffer is we can get the same behavior with it
14:38:04 [anssik]
... this op was proposed before MLBuffer existed
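Austin's observation that a dedicated op may be unnecessary can be sketched in userspace: precompute the sequence on the CPU and hand it to the builder as an ordinary constant (or share it via MLBuffer). The helper below is hypothetical, not part of the API, and uses float32 for simplicity.

```javascript
// Hypothetical userspace emulation of constant(start, end, step, type):
// generate the half-open sequence [start, end) on the CPU, then pass the
// resulting typed array to the builder as a plain constant.
function fillSequence(start, end, step) {
  const out = [];
  // Support both ascending (step > 0) and descending (step < 0) sequences.
  for (let v = start; step > 0 ? v < end : v > end; v += step) {
    out.push(v);
  }
  return Float32Array.from(out);
}
```

For example, `fillSequence(0, 5, 1)` yields the five values 0 through 4; the output shape is explicit in the array's length, which addresses Dwayne's preference for explicit output shapes.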
14:38:05 [anssik]
q?
14:38:18 [anssik]
Dwayne: thanks for the detailed feedback Austin
14:38:48 [anssik]
... I'd be inclined to remove it
14:38:59 [anssik]
... I want WebNN to be explicit with its output shape
14:39:33 [anssik]
... I will get back to the thread today
14:39:34 [anssik]
q?
14:39:36 [Ningxin_Hu]
q+
14:39:42 [anssik]
ack Ningxin_Hu
14:40:24 [anssik]
Ningxin_Hu: want to clarify that the use case for no-op graphs is to share a constant between graphs, and MLBuffer satisfies that use case; not aware of other use cases for this feature
14:40:57 [anssik]
q?
14:41:16 [anssik]
Subtopic: [question] Can an MLGraphBuilder be reused?
14:41:21 [anssik]
anssik: issue #567
14:41:21 [gb]
https://github.com/webmachinelearning/webnn/issues/567 -> Issue 567 Can an MLGraphBuilder be reused? (by reillyeon) [question]
14:41:34 [anssik]
... an issue opened by Reilly and discussed last time; he was not able to participate then but relayed this message:
14:41:39 [anssik]
"I'm satisfied that there are good use cases for constructing multiple MLGraphs from an MLGraphBuilder but we need example code and implementation experience before we can decide specifically how it will work"
14:42:04 [asully]
q+
14:42:10 [anssik]
anssik: I'm assuming the group's position is the same i.e. we want to explore more with example code and/or implementations?
14:42:11 [anssik]
q?
14:42:14 [anssik]
ack asully
14:42:44 [anssik]
asully: I can try to speak for Reilly a bit, he requested that multiple graphs can be built at the same time
14:43:40 [anssik]
... when do we know we can free up memory we've copied? every build call needs to pass constant weights
14:43:53 [anssik]
... there's no point at which you can release the memory, since you don't know whether build will be called again
14:44:30 [anssik]
... if you can ensure build is called once, or several times at once, i.e. if you can expire the MLGraphBuilder, you can free the extra memory copies
14:44:45 [anssik]
Dwayne: should MLGraphBuilder be stateful so it owns the things it creates?
14:44:57 [anssik]
Austin: what does the constant method do?
14:45:21 [anssik]
... it implies it is stateful; if not, a copy has to be made at some point, at the build call?
14:46:17 [anssik]
... if you change the content of that buffer you change what's passed to build; if you want to reuse an ArrayBuffer, calling constant with a buffer and calling it again with the same buffer, you'd think you're passing different values; unintuitive for developers
14:46:36 [anssik]
... arguing MLGraphBuilder should be stateful
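The intuition problem Austin describes can be shown with a toy builder (invented for illustration, not the WebNN API): whether constant() snapshots its buffer eagerly or keeps a live reference until build() is observable to the caller.

```javascript
// Toy illustration of the two possible copy semantics for constant():
// eager (snapshot the buffer immediately) vs. lazy (hold a live reference
// and only snapshot at build() time).
class MiniBuilder {
  constructor(copyEagerly) {
    this.copyEagerly = copyEagerly;
    this.constants = [];
  }
  constant(typedArray) {
    // Eager: copy now. Lazy: keep a reference until build().
    this.constants.push(this.copyEagerly ? typedArray.slice() : typedArray);
  }
  build() {
    // Either way, build() locks values in place by taking its own copies.
    return this.constants.map(c => c.slice());
  }
}

const weights = new Float32Array([1, 2]);
const eager = new MiniBuilder(true);
eager.constant(weights);
const lazy = new MiniBuilder(false);
lazy.constant(weights);
weights[0] = 99; // caller reuses the buffer before calling build()
// eager.build() still sees 1; lazy.build() sees 99 — the mutation leaked in.
```

With lazy semantics the caller's later mutation silently changes the graph, which is the unintuitive behavior Austin flags as an argument for a stateful, eagerly copying builder.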
14:46:47 [anssik]
Dwayne: changing the buffer later, before build; agree that shouldn't happen
14:47:10 [Ningxin_Hu]
+q
14:47:13 [anssik]
q?
14:47:33 [anssik]
Austin: if we copy the data and MLGraphBuilder is stateful, when do we release all its data?
14:47:35 [anssik]
q?
14:47:37 [anssik]
ack Ningxin_Hu
14:48:02 [anssik]
Ningxin_Hu: I think Austin made a good point, not intuitive for a developer if the data in the buffer can be changed
14:48:21 [anssik]
... when copying data, I see a use case for two graphs sharing the constant when building the graph
14:48:33 [anssik]
... valuable if the buffer is copied at constant build time
14:48:46 [anssik]
... it is used for the GPU at build time; the same copy of a buffer can be shared
14:48:58 [RafaelCintron]
q+
14:49:06 [anssik]
... if using MLBuffer and constant we can keep one copy of the data and share with different graphs
14:49:24 [anssik]
... one constant can be shared with different graphs
14:49:51 [anssik]
Austin: MLBuffer.destroy() is explicit; with big buffers you want to clean up memory, that's expensive
14:50:09 [anssik]
... with one copy we can pass the same copy everywhere, not how we implement it, but can optimize implementation
14:50:26 [anssik]
... the answer is whenever the operators associated with the GraphBuilder are GC'd; the answer now is never
14:50:37 [anssik]
ack RafaelCintron
14:51:04 [anssik]
RafaelCintron: thanks Austin; if we hook up MLBuffer in the future to be a constant, there's another when-to-copy problem
14:51:28 [anssik]
... I may want MLBuffers here and there and change them, call build, and change them again; when are the changes reflected in graphs?
14:51:42 [anssik]
... I think once you call build things should be locked in place
14:52:02 [anssik]
... I can see it being more intuitive that way, not feeling strongly about that
14:52:31 [anssik]
... the builder could grow too big and consume too much memory; we could introduce destroy on it, and on the context too, to release GPU memory
14:52:54 [anssik]
... if associated with a high power device it could dial down power usage, though we should shy away from doing that
14:52:54 [anssik]
q?
14:53:27 [anssik]
Austin: regarding MLBuffer we haven't specified the timeline of how things happen; a good question is when should the data become static?
14:53:33 [anssik]
... when build is called?
14:53:53 [anssik]
... would that create a copy? whatever we decide, I expect it to be well defined
14:53:54 [anssik]
q?
14:54:23 [anssik]
RafaelCintron: constant could say "I own it" or "you own it"; "I own it" has performance implications
14:54:41 [anssik]
... we can gain implementation experience on which way to go
14:54:44 [anssik]
Austin: makes sense
14:54:44 [anssik]
q?
14:55:19 [anssik]
Subtopic: [question] Specify the operand data type constraints of operation
14:55:23 [jsbell]
q+
14:55:23 [anssik]
anssik: issue #283
14:55:24 [gb]
https://github.com/webmachinelearning/webnn/issues/283 -> Issue 283 Specify the operand data type constraints of operation (by huningxin) [question]
14:55:32 [jsbell]
https://github.com/webmachinelearning/webnn/pull/646 -> Pull Request 646 Specify the operand data type constraints of operations
14:55:33 [gb]
https://github.com/webmachinelearning/webnn/pull/646 -> Pull Request 646 Specify the operand data type constraints of operations (by inexorabletash)
14:55:51 [anssik]
... Ningxin reported initially (in 2022): "The current spec doesn't specify the operand type constraints of an operation. However, some operations, e.g., softmax should only support float32 operand type according to the survey of frameworks and native ML APIs"
14:56:01 [anssik]
... fast forward to 2023/24, Ningxin provided an extensive summary of the operand data type constraints for current WebNN ops:
14:56:13 [anssik]
-> https://github.com/webmachinelearning/webnn/issues/283#issuecomment-1818387521
14:56:13 [gb]
https://github.com/webmachinelearning/webnn/issues/283 -> Issue 283 Specify the operand data type constraints of operation (by huningxin) [question]
14:56:48 [anssik]
jsbell: PR #646 follows what we do in the rest of the spec, if the data type is not an allowed type, throw an error
14:56:49 [gb]
https://github.com/webmachinelearning/webnn/pull/646 -> Pull Request 646 Specify the operand data type constraints of operations (by inexorabletash)
14:57:25 [anssik]
jsbell: do we want to be table-driven with this information?
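A minimal sketch of the validation style PR #646 specifies in prose: look up the operation's allowed data types and throw a TypeError otherwise. The allowed set shown for softmax follows the issue's float-only example; the authoritative sets live in the spec, and the table here is illustrative only.

```javascript
// Illustrative per-operation data type constraints, keyed by op name.
// The real constraints are defined normatively by the WebNN spec.
const allowedDataTypes = {
  softmax: new Set(['float32', 'float16']),
};

// Sketch of the validation step: reject operands whose dataType is not in
// the op's allowed set, mirroring "if not an allowed type, throw an error".
function validateOperandType(op, operand) {
  const allowed = allowedDataTypes[op];
  if (!allowed || !allowed.has(operand.dataType)) {
    throw new TypeError(`${op} does not support operand data type ${operand.dataType}`);
  }
  return operand;
}
```

Whether this lookup lives in a central table or inline in each op's algorithm is exactly the table-driven vs. inline question raised above.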
14:57:30 [anssik]
q?
14:57:31 [Ningxin_Hu]
+1 to inline
14:57:33 [anssik]
ack jsbell
14:58:12 [anssik]
anssik: Yajing provided comparison with CoreML: https://github.com/webmachinelearning/webnn/issues/283#issuecomment-2050283218
14:58:37 [anssik]
anssik: a non-blocking question from Jiewei: "Should mixed precision be allowed when the op involves accumulation?"
15:00:14 [anssik]
Subtopic: [operator specific] Type of some parameters should match the input data type
15:00:26 [anssik]
anssik: issue #442
15:00:26 [gb]
https://github.com/webmachinelearning/webnn/issues/442 -> Issue 442 Type of some parameters should match the input data type (by Honry) [operator specific]
15:00:26 [anssik]
... Wanming reports the option values for pad (MLPadOptions) and clamp should match the input data type
15:00:31 [anssik]
... a proposal from Josh in the issue is to add a new typedef:
15:00:36 [anssik]
typedef (bigint or unrestricted double) MLNumber;
15:00:47 [anssik]
anssik: this typedef to be used for:
15:00:47 [anssik]
... - constant(value, type)
15:00:47 [anssik]
... - constant(start, end, step, type) (see #571 and #492)
15:00:47 [anssik]
... - MLClampOptions
15:00:47 [anssik]
... - MLPadOptions
15:00:47 [gb]
https://github.com/webmachinelearning/webnn/issues/492 -> Issue 492 Constant sequential filling operation needs output shape parameter (by fdwr) [feature request] [operator specific]
15:00:48 [gb]
https://github.com/webmachinelearning/webnn/issues/571 -> CLOSED Issue 571 WebIDL definition for constant(start, end, step, type) is missing (by inexorabletash) [bug] [duplicate] [question] [operator specific]
15:00:52 [Ningxin_Hu]
+1 to have a PR for MLNumber
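To see why the `(bigint or unrestricted double)` union matters, here is a hypothetical normalization helper: int64/uint64 values round-trip through BigInt, since a JS number loses integer precision above 2**53, while float types coerce a bigint to number. This is a sketch of the idea, not proposed spec text.

```javascript
// Hypothetical normalization of an MLNumber (bigint or number) against the
// target operand data type.
function normalizeMLNumber(value, dataType) {
  if (dataType === 'int64' || dataType === 'uint64') {
    // 64-bit integer operands keep full precision via BigInt.
    return typeof value === 'bigint' ? value : BigInt(Math.trunc(value));
  }
  // Float operands accept either representation but are used as numbers.
  return typeof value === 'bigint' ? Number(value) : value;
}
```

For example, a clamp maxValue of 2**60 on an int64 operand survives as an exact BigInt, where a plain double parameter would silently round.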
15:01:05 [anssik]
q?
15:01:11 [anssik]
RRSAgent, draft minutes
15:01:12 [RRSAgent]
I have made the request to generate https://www.w3.org/2024/04/18-webmachinelearning-minutes.html anssik
15:27:43 [anssik]
RRSAgent, draft minutes
15:27:45 [RRSAgent]
I have made the request to generate https://www.w3.org/2024/04/18-webmachinelearning-minutes.html anssik
17:03:14 [Zakim]
Zakim has left #webmachinelearning
18:36:55 [\join_subline]
\join_subline has joined #webmachinelearning
19:31:37 [\join_su1line]
\join_su1line has joined #webmachinelearning