Meeting minutes
Guests and W3C Patent Policy
W3C Patent Policy
McCool: by joining this call, we assume everybody agrees to the W3C patent policy above
McCool: today's call is organized for the IG side, though
McCool: (quickly shows the agenda of the vF2F this week and next)
... IG: joint on March 15
… joint sessions with liaison orgs today
… WG: Discovery on March 17
… Use Cases on March 18
… IG: Architecture and Profiles on March 22
Lagally: includes some use cases discussion with ITU-T as well on March 22
McCool: ok
… (adds that to the slides)
… any other changes?
(none)
… (shows today's agenda)
scribes
Kaz: will take notes for the first part
… need one more
McCool: volunteer?
Ege: will do
McCool: presenters, please send your slides to me
… will install them on GitHub
(Sebastian gives a reminder to all on IRC)
WoT summary and status
McCool's slides
McCool: because we have many guests today
… would like to summarize what WoT is like
… [W3C Web of Things (WoT)]
… adapting Web technologies to IoT
… we're on our 2nd Charter
… TD metadata format is already done, and a big chunk of this charter is discovery
… [WoT Descriptive Interoperability]
… WoT Architecture and WoT Thing Description
… affordances are abstract layer to handle devices
… [Usage Patterns Overview]
… vertical use cases based on the industry needs
… [WoT Orchestrations]
… multiple things there to be combined with each other
… an open-source implementation based on Node-RED named node-gen
… and node-wot for Scripting API
… including discovery capability
… will be mentioned during the plugfest report
… [Current WoT WG Charter Work Items]
… a lot of topics
… [Current Status]
… links here
… architecture 1.1 draft
… TD 1.1
… Discovery
… Profiles
… Binding Templates
… Scripting API
… new Group page at https://www.w3.org/WoT/
… questions?
(none)
IEC CDD
-> Murayama-san's slides @@@
Hiroshi_Murayama(hm)
Hiroshi: [Common Data Dictionary (CDD) on Parcellized Ontology Model and related standards]
… [CDD in a nutshell]
… Common Data Dictionary
… based on IEC 61360-2, ISO 13584-42
… by IEC SC 3D and ISO TC184/SC4
… evolution of ontology elements
… [CDD in a nutshell 2/2]
… ontology developed over time
… used in system infrastructures
… [Base standards]
… data model: IEC 61360-2, ISO 13584-42, IEC 62656-1 and ISO 13584-35
… exchange formats: ISO 13584-32, IEC 62656-1, IEC 62656-8, ...
… domain data dictionaries / ontologies being stored and to be stored: IEC 61987-11, IEC 62683, ISO 13584-501, ISO 13584-511, ISO 13399, ISO 23584
… plant automation, switch gear/control panel, laboratory instruments, ...
… IEC DD has been extended in coverage as well as in its functionality
… current IEC CDD is implemented as a subset of IEC 62656-1
… relational ontology model and evolved from ISO 13584-35
… multi-lingual model
… [Cooperation among TCs/SC in IEC & ISO]
… (diagram on the activity)
… [4-layer ontology data model]
… 1. axiomatic ontology (AO)
… 2. meta ontology (MO)
… 3. domain ontology (DO)
… 4. domain library (DL)
… [Every layer of ontology in POM is defined as a set of instances of its upper layer, except the Axiomatic Ontology layer at the top]
… M4-M3 for AO, M3-M2 for MO, M2-M1 for DO, M1-M0 for DL
… DL is provided by each company
… [Reference mechanism in IEC61360(PLIB) & IEC62656(POM)]
… structure of the ontologies
… class ID -> class -> translated text -> property -> property ID
… relationship between the elements over time to be maintained
… [Class needs to point exactly to the specific version, when ontology evolves over time]
… [Reference mechanism in POM/CDD]
… ICID defined in IEC 62656-1 (ISO 13584-35) adds separator1 (default "#")
… and separator 2 (default "##") which may be redefined
… [Detail of the RAI]
… [IEC CDD as online DB]
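The versioned reference mechanism above can be sketched as follows. The identifier layout used below is purely illustrative, not the normative ICID syntax from IEC 62656-1; only the two configurable separators (defaults "#" and "##") come from the presentation.

```python
# Sketch of splitting a versioned class reference using the two
# separators mentioned for IEC 62656-1 ICIDs (defaults "#" and "##").
# NOTE: the identifier layout below is purely illustrative.

def split_icid(icid: str, sep1: str = "#", sep2: str = "##"):
    """Split an identifier into (code parts, version).

    sep2 (default "##") separates the version suffix; sep1 (default "#")
    separates the remaining code components.
    """
    if sep2 in icid:
        base, version = icid.split(sep2, 1)
    else:
        base, version = icid, None
    return base.split(sep1), version

parts, version = split_icid("0112-2-61360#AAA211##003")
```

Splitting on the longer separator first avoids misreading "##" as two occurrences of "#".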
IEC 61360
Hiroshi: [Domain standard development with IEC CDD]
… actual definition is stored within the CDD
… actual content is maintained and updated on the ontology side
… [Concept of Class, Property, and Relation]
… [Concept of class in CDD/POM]
… Class C = {x | P1(x) ∧ P2(x) ∧ ... ∧ Pn(x)}
… [Concept of class and property in CDD/POM]
… [Future? COR on POM as an extended CDD]
… (diagram)
… (this is inline with DPPC final report)
… Common Ontology Repository (COR)
… [Collaboration and cooperation]
… ISO TC184/SC4 and IEC...
… [Summary]
… CDD is implemented as open source
McCool: thanks!
… time check
… OK to have discussion within 10 minutes
… if we need to add anything
… need to understand concrete applications
… we're aiming for finalizing the procedure
… point for connection should be Thing Description and Thing Model
… any existing method to be adapted based on RDF?
Hiroshi: the ontology evolves over time
McCool: additional capability might be version management
… sounds like maybe not completely
… we do support SPARQL
… important to figure out how to access CDD
… maybe search over versions
Sebastian: thanks a lot first
… wondering why you use the term "Ontology"
… assuming it's based on RDF
Hiroshi: RDF is just format for the ontology
Sebastian: tools to discover the actual definition?
… what kind of format to be exported?
Hiroshi: mostly XML is used
… JSON is also possible
… there is a list of export formats
Lagally: many thanks
… online DB
… how many entries?
… how many types?
Hiroshi: more than 20 ontologies
… and maybe 10 thousand entries?
Lagally: what is the typical target areas?
Hiroshi: semiconductor, etc.
… but it's just a part
yoshiaki_sonoda(ys)
Yoshiaki: let me provide some information
… oil, gas, power, etc., is included
… more types of dictionary for ISO community
… originally oil and gas but recently extended for electric power
… imported CDD
… one of the purposes of the open CDD DB
… we'll have more sort of dictionaries (for broader vocabularies)
Hiroshi: (shows the diagram again)
… [Cooperation among TCs/SCs in IEC & ISO]
Lagally: engineering purposes?
Hiroshi: including engineering
Kaz: tx
… think we should continue collaboration on how to refer to the data defined in CDD from the WoT side
McCool: should work on 2 levels
… 1. short-term for this Charter period
… and 2. longer-term for the future work
… long-term liaison to be considered for the next Charter too
… btw, wondering about the IEC CDD and ECLASS's work
McCool: does ECLASS depend on the IEC work?
Block: IEC CDD is a data model
… and the basis of ECLASS
… ECLASS defined data as part of CDD
<mjk> Can someone please send me the teleconference coordinates?
Block: IEC work is industry-independent
Block: ECLASS develops its own JSON model as well
Sebastian: so was wondering about RDF export of CDD
Sebastian: if there is an RDF version, could be integrated with Thing Description easily
McCool: would like to focus on the next step
Kaz: we should continue more collaborative discussion to reuse CDD data for WoT purposes
… maybe we can do that via official liaison but should have more discussion first
McCool: right
… should have more discussion to define the goal
… how to proceed?
Kaz: if it's OK by the IEC side, we can establish/extend a simple liaison for that purpose
Sebastian: would see use cases as well
McCool: makes sense
… we have vertical/horizontal use cases
… concretely and logistically, we hold use cases call biweekly
Lagally: we'll have use cases session during the vF2F as well on Thursday, March 18
McCool: note that this is the virtual F2F call this week and next
… we hold regular calls biweekly as well
McCool: (shows the use cases session agenda on March 18)
<McCool> https://www.w3.org/WoT/IG/wiki/Main_WoT_WebConf
McCool: or you can join the regular Use Cases call
… when will be the next call?
Lagally: in April but should confirm
Kaz: would like to continue the discussion with Murayama-san and Sonoda-san about further collaboration
Hiroshi: btw, please note that IEC and ECLASS are completely different organizations
McCool: got it
Block: can provide information about ECLASS as well
McCool: ok
(Murayama-san and Sonoda-san leave)
[10min break]
WebThings
-> Ben's slides tbd
Ben: (introduces himself)
… [WebThings Gateway]
… backend with NodeJS
Ben: the gateway is a producer and a consumer at the same time
… it consumes TDs and re-exposes them to the Internet securely
… The exposed TDs can be made W3C compliant quite easily
… consuming is different
… thanks Siemens for contracting Cristiano to make the gateway TD compliant
… we have drafted a plan
… I have created a PR for the repo to add the slides
… these are the links for issues
<sebastian> find all issues here: https://github.com/WebThingsIO/gateway/labels/w3c-compliance
Ben: I have also submitted feedback for the profile TF
… also this week I have reviewed the discovery spec and I have provided feedback
McCool: we had some work on the additional and error responses, did you check that?
Ben: yes I have been following those discussions
… error and additional responses are in a good direction since they are needed for describing the full API
McCool: maybe there should be a testbed for these dynamic resources
Ben: TDs cannot describe collections of resources
<kaz> ek: wondering how the data consumption is done
Ben: making the WebThings W3C compliant is the question here
… (shows the WebThings Gateway diagram again)
… adapters communicate on the backends
… the individual adapter communicates with the backend
Ege: if I have a Zigbee device, should use some specific API?
Ben: you can use some Thing Description for that purpose
Ege: in that case, why is there a compatibility problem?
Ben: good question
<mjk> We have a TM which was converted from the OneDM model for a Zigbee Level cluster?
Ben: the gateway may bridge a local network and the Internet
Ben: so the gateway consumes WebThings in the network (via Thing URL adapter) and this follows the WebThing API. This requires each adapter to adapt to this
Sebastian: I am very happy to see that Webthings will be also using TDs
... what do you think is the best way to approach the 3rd party library developers
... for the python I know someone who can have a look
Daniel: can the third party libraries stay behind?
Ben: They can but this will be a breaking change and we want all users of the gateway to upgrade timely
Kaz: thank you for considering compatibility with TDs
... we should work on a best practices kind of guideline together and clarify which entity uses what data models and interfaces
McCool: thank you for the presentation and guiding this
T2TRG
<McCool> Agenda
<McCool> Minutes
McCool: sadly Ari, Carsten or others from the group could not join this call
... the main outcome was Koster's work on an ASDF <-> TD converter
... (shows website of IETF meeting on ASDF)
... I did a WoT update session
... Milan Milenkovic is working on an IoT landscape document
... trying to put a table with a taxonomy
... I need to get together with Koster and Milan to continue on this work
Koster: you can have a TM for a zigbee device that doesn't have network addresses
<sebastian> sdf-2-TM tool: https://github.com/roman-kravtsov/sdf-object-converter
McCool: there is another paper under review about IoT edge challenges
... next on bootstrapping terminology
Secure IoT Bootstrapping: A Survey
McCool: I gave feedback on removing some words like "Configuration" from the IoT bootstrapping analysis
Koster: we have finalized ASDF and have draft 5 which is a stable release for implementations
OneDM repo
Koster: we can create TMs from SDF definitions
McCool: for Ben, maybe you can take TMs instead of TDs for integrating into the gateway
Sebastian: regarding edge discussion, was there any discussion on edgex foundry?
McCool: that is a completely different project, like WebThings
... it allows integrating Things through a gateway
Ben: I need to look at onedm
... Have you tried zigbee definitions with physical devices?
... there is another project called "zigbee2mqtt" and the community did an amazing job of classifying zigbee devices
Koster: onedm will help with defining new zigbee devices
... bluetooth is also onboard
Ben: we have capability schemas on our repository
Koster: we were lagging behind with oneM2M
... regarding iotschema.org, we talked with Dan Brickley. They will not be merged. They want to focus on datatypes and not interested in affordances
Kaz: after OneDM brought the SDF idea to IETF, is this repository still used by OneDM?
Koster: we will also have our CI tool so that you can use it in your own repos
ECHONET
Matsuda-san's slides
Tetsushi Matsuda (tm)
Tetsushi: I am representing Hirahara-san
... who is the liaison contact from ECHONET
... each device is described via properties
... actions are done as classes via setting a property
(tm shows the structure of the ECHONET Lite Web API Guidelines)
Tetsushi: devices are modeled with property, action and event as in WoT
... there are read status, control and notification functions
... notify function is like observable in WoT
... you can define a group of devices and client can set them with one API call
... the cloud can do the individual requests
... it is also possible to implement historical data in servers. The client cannot ask for the recording to begin but can read historical data
... we also have guidelines for the API such as HTTPS, HTTP 1.1, json, utf8
... For INF (notification) it is desirable to support push notification mechanisms
... ECHONET specifies WebSocket, MQTT and long polling
<McCool> ok
Tetsushi: differences with WoT
… syntax, ...
McCool: would extend the meeting by 5 mins to finalize the discussion
… interested in the "grouping" functionality
Sebastian: thank you for the nice presentation and comparison to WoT. How do you realize eventing?
Tetsushi: we use long polling, WebSocket, MQTT and WebHooks to implement eventing
<Ege> https://github.com/w3c/wot-thing-description/issues/892 regarding historical data
Sebastian: is there no form data in the device description?
<McCool> (right, so it seems the URLs etc. are implied by the property name, etc)
Tetsushi: since we support only HTTP, we simplify the device description and do not include binding templates
<Zakim> kaz, you wanted to ask if "getting the list of devices" is kind of discovery and to ask if there is any specific API to group multiple devices and to ask what the data model for history is like
Kaz: wanted to ask (1) if "getting the list of devices" is kind of discovery and (2) if there is any specific API to group multiple devices
<McCool> (so discovery starts with the hub URL, where you can get list of devices, and also devices by type, but there is not a general content-based query mechanism)
Tetsushi: regarding grouping, the client can define a group of devices with some expected actions
<Zakim> dape, you wanted to ask if bulk actions are revoked if one action fails
Daniel: regarding the href, a WoT Thing can choose to use that kind of hierarchy but does not have to
Tetsushi: if the transaction succeeded, you'll get the new property value back from the device side
(=equal to the set property itself)
Daniel: regarding the bulk operation, what happens if one does not turn on? do you notice it somehow?
... what if we turn on several lamps at once and part of them failed?
... it is best effort
... the client can identify if the action failed
<Ege> https://github.com/w3c/wot-thing-description/issues/892
Tetsushi: the client can get a history object
Ege: question on history
<McCool> (we have to wrap up though... so I'm going to have to close the queue and ask for next steps)
McCool: would like to have a follow-up call, e.g., during the Use Cases calls
… can you join the Use Cases call?
Tetsushi: yes
… but the liaison procedure on the ECHONET side is still being discussed
Kaz: will talk with them again about that
McCool: ok
Sebastian: also would be great if ECHONET can join the upcoming Plugfests
McCool: yeah
… that would be in October
Ege: wondering if there is any API publicly exposed?
Tetsushi: no
… maybe could provide something for the WoT guys for Plugfests
… but need confirmation from the ECHONET org
Wrapup
Slides
McCool: Discovery on Wed
[adjourned]
Meeting minutes
Scribe
Sebastian for the 1st part
Daniel for the 2nd part
Introductions - Toumura
Slides
Toumura: gives an overview of the two-phase architecture
… there are 5 mechanisms for discovery: CoRE Resource Directory, DNS, Direct, Well-Known URI, and Decentralized Identifier (DID)
Toumura: compared to the last vF2F we have not introduced significant changes
… updated the type usage in CoRE Link Format
… and a few editorial changes
<Zakim> dape, you wanted to 1st phase talks about TD of thing or TD of directory. Isn't there a 3rd category (missing)? List of TDs of things?
Daniel: Back to slide 2, I'm wondering if there should be another category like list of TDs?
McCool: we'd like to be as simple as possible
… typically you get one URL
McCool: we should discuss about the well-known approach about this
Ben: What is the difference between direct and the other methods?
McCool: all provide a URL
Sebastian: How about RFID or QR-Code scan?
McCool: those will also provide a URL
Toumura: we need implementations of the different approaches
… so far no implementation of the DID approach
McCool: What kind of implementation WebThing has?
Ben: we have implemented mDNS
<kaz> kaz: would confirm implementations by RIOT OS
Exploration Mechanism - Farshid
<FarshidT> Please refer to my last F2F presention for draft API design decisions: https://github.com/w3c/wot/blob/main/PRESENTATIONS/2020-10-online-f2f/2020-10-20-WoT-F2F-Discovery-DirectoryAPI-Tavakolizadeh.pdf
<mjk> mDNS multicast DNS, and DNS-SD is Service Discovery using DNS (or mDNS)
Farshid: shows Figure 4 in discovery spec
… there are two TD classes that are called Directory Description and Link Description
… link description uses the link approach with rel=describedBy
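The link-based exploration above can be sketched as a client extracting the TD target from an HTTP Link header. The header value and the simplified parser below are illustrative assumptions, not spec text.

```python
import re

# Sketch: extract the target of a Link header with rel="describedby",
# the pattern mentioned for the link-based exploration mechanism.
# The header value used here is illustrative.

def find_describedby(link_header: str):
    # Matches <target>; ...rel="describedby"  (very simplified parser,
    # not a full RFC 8288 implementation)
    for target, params in re.findall(r'<([^>]*)>([^,]*)', link_header):
        if re.search(r'rel="?describedby"?', params, re.IGNORECASE):
            return target
    return None

url = find_describedby(
    '<https://example.com/td/device1>; rel="describedby"; type="application/td+json"'
)
```

A real consumer would then fetch that URL to retrieve the TD.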
Lagally: Are TMs supported?
McCool: not yet. A timing issue.
… maybe in the 2nd version
<citrullin> pb: We already implemented that in RIOT. The rt CoRE topic. Even though we haven't tested that yet @@@ (to be moved later)
McCool: for the Directory Description the consumer has to follow the links
<kaz> wot-discovery - 6. Exploration Mechanisms
Farshid: if there are federated DDs then there should be one DD
Ben: is it possible to have a public directory?
… how about local devices?
McCool: needs security consideration and we need to clarify this in the spec. I will create an issue about this
Farshid: self description is new in the spec
<kaz> 6.1 Self-description
Farshid: is an exploration mechanism in which a Thing hosts its own TD
… TDs can be also provided in partial manner (e.g., for constrained devices)
<kaz> wot-discovery - Issue 132 - Peer-to-Peer Queries Endpoint in Producers
Farshid: there is also a way to query TD elements
Sebastian: In the Profile discussion it would be good that a consumer can discover TDs with a specific TD size
McCool: would be good to cover this. I will provide an issue
Philipp: The range is only for HTTP. We want to cover other protocols in the future
<kaz> Range HTTP request header
<citrullin> FT: I cover this topic later in my presentation.
Farshid: shows the information model of the directory
<kaz> 6.2 Directory
Farshid: based on a TD https://w3c.github.io/wot-discovery/#directory-thing-description
… shows the different operations like createTD, updateTD, deleteTD, etc
<McCool> (I created an issue about allowing HTTP in some cases: https://github.com/w3c/wot-discovery/issues/139)
<kaz> Farshid goes through Example 3
<kaz> 6.2.2.1.5 Listing from the preview of PR 130
Farshid: the last part of the directory section is Listing
https://w3c.github.io/wot-discovery/#exploration-directory-api-registration-listing
Farshid: shows an example with content-range
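A rough sketch of how a consumer could walk such a paginated listing follows; the "items" range unit and the header format below are assumptions for illustration, not necessarily the exact syntax in the draft.

```python
import re

# Sketch of client-side pagination over a directory listing that reports
# progress with a Content-Range-style header. The unit ("items") and
# format are illustrative; see the wot-discovery pagination issues for
# the actual design.

def parse_content_range(value: str):
    """Parse e.g. 'items 0-9/120' into (first, last, total)."""
    m = re.fullmatch(r'items (\d+)-(\d+)/(\d+|\*)', value)
    if not m:
        raise ValueError(f"unrecognized Content-Range: {value!r}")
    first, last = int(m.group(1)), int(m.group(2))
    total = None if m.group(3) == "*" else int(m.group(3))
    return first, last, total

def next_range(value: str):
    """Return the Range header value for the next page, or None when done."""
    first, last, total = parse_content_range(value)
    if total is not None and last + 1 >= total:
        return None
    page = last - first + 1  # ranges are inclusive on both ends
    return f"items={last + 1}-{last + page}"
```

Note the inclusive range arithmetic: after "items 0-9/120" the next page is 10-19, not 10-20.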
… there is also some news about validation
https://w3c.github.io/wot-discovery/#validation
<kaz> RFC7807 Problem Details for HTTP APIs
<kaz> wot-discovery issues related to Pagination
Ben: I like the approach of pagination.
… there is some discussion about what are properties and what are actions
… shall we standardize the interaction names?
… like "createTDs" ...
McCool: recommend we should standardize the TD / TM.
<Zakim> dape, you wanted to 10-12 range seems wrong, should be probably 10-11 (for 12 items)
Ben: a disadvantage with the current approach is that multiple interactions may be needed
McCool: this is right, that the current TD approach is quite long. Maybe we should point to a TM. Simpler TD for constrained devices would be good
Farshid: there are many optional definitions. You do not need to implement all
Lagally: you define status codes, are those normative?
Farshid: Yes they are normative
Lagally: are we prepared to handle all of them?
Farshid: there is a minimum required set
Lagally: important for the profile discussion
<kaz> [note that Example 3 within the WoT Discovery spec is informative; the normative codes are defined by RFC7807]
Farshid: we have 40x and 50x codes so far
<kaz> sk: more protocol specific topics
<kaz> ... you have syntactic XPath there as well
McCool: we should have error codes in a separate section
<kaz> ... serialization format for TD is basically JSON-LD
Sebastian: wondering about XPath, does it also work for JSON?
Farshid: it's mainly syntactic search with XPath. JSONPath is not standardized yet; XPath is just a fallback
Cristiano: question about pagination. This approach should take into account different protocols. Also it might have impacts on the current design of the Scripting API.
Farshid: there is some discussion in a PR (https://github.com/w3c/wot-discovery/pull/130) about the URL approach
<kaz> kaz: having a separate section for error definition is fine, but we should be careful about how to deal with the error definition there because we'd like to import the definitions from RFC7807.
Philipp: I do not see a big deal with CoAP. I'm worried that your API may be complicated to map to protocols other than HTTP and CoAP.
McCool: currently concentrating on HTTP
Zoltan: there is also discovery in the Scripting API which is not aligned with WoT Discovery yet
… scripting is only phase 2 and filtering
<dape> scribenick. Dape
Andrea: [SLIDES] Syntactic discovery in directories
… discovery could be semantic, not only syntactic
… TD discovery answer: array of TDs
… JSONPath, mandatory
… not standard, but widely used
… e.g., ../jsonpath?query={query}
… XPath, version 3.1
… supports JSON
… W3C standard
… e.g., ../xpath?query={query}
… XPath more complete than JSONPath
… Pros and Cons
… Pro
… - short and expressive
… - passed as URL
… Cons
… - TD fragments
… - complex queries
… - JSONPath not standard
McCool: JSONPath is on a path to becoming an IETF standard
Andrea: ... - JSONPath may lead to security issues
… Conclusion: MUST JSONPath, SHOULD XPath
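The query-in-URL pattern from the slides (../jsonpath?query={query} and ../xpath?query={query}) can be sketched as follows; the directory base URL and the example queries are made up for illustration.

```python
from urllib.parse import quote, urljoin

# Sketch: building the syntactic-search URLs shown on the slides.
# BASE and the queries below are illustrative assumptions.

BASE = "https://directory.example.com/search/"

def search_url(endpoint: str, query: str) -> str:
    # Queries travel in the URL, so they must be percent-encoded.
    return urljoin(BASE, endpoint) + "?query=" + quote(query, safe="")

jsonpath_url = search_url("jsonpath", "$[?(@.title=='MyLampThing')]")
xpath_url = search_url("xpath", "*[title='MyLampThing']")
```

This also makes the "passed as URL" pro from the slides concrete: short queries encode compactly, but complex ones grow quickly.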
Andrea: Semantic Discovery in Directories
… same idea, answer is JSON
… result is same
… SPARQL is optional
… SPARQL is W3C standard
… returns JSON-LD
… support for: SELECT, ASK, CONSTRUCT, and DESCRIBE
… *no* support for: UPDATE
… SPARQL query can be codified as URL
… for GET the query must be URL-encoded; for POST we send it in the body (not encoded)
… answer is JSON
… ASK is used to query whether something exists -> results in a boolean
… DESCRIBE & CONSTRUCT returns JSON-LD
… DESCRIBE & CONSTRUCT have a problem... TDs are JSON-LD framed documents
… SPARQL allows us to use query federation
… e.g., specify to forward it
… we need to know endpoints ahead of time
… JSON-LD frames -> translating -> JSON-LD/RDF
… back to JSON-LD frame is not possible. (needs framing rules)
… Pros and Cons for SPARQL
… Pros
… - expressive
… - query language with functions etc (W3C standard)
… - federated queries
… Cons
… - simple queries are more verbose than JSONPath/XPath
… - consumes more resources
… Conclusion
… SPARQL is optional
… semantic discovery is very flexible
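A minimal sketch of the two transport options discussed for SPARQL queries, following the SPARQL 1.1 Protocol (encoded in the URL for GET, raw in the body for POST); the endpoint URL and the query are illustrative.

```python
from urllib.parse import urlencode

# Sketch of the two ways a SPARQL query reaches a directory endpoint,
# per the SPARQL 1.1 Protocol. ENDPOINT and QUERY are illustrative.

ENDPOINT = "https://directory.example.com/search/sparql"
QUERY = "ASK WHERE { ?thing a <https://www.w3.org/2019/wot/td#Thing> }"

# GET: the query must be URL-encoded into the query string
get_url = ENDPOINT + "?" + urlencode({"query": QUERY})

# POST: the query goes into the request body, unencoded
post_headers = {"Content-Type": "application/sparql-query"}
post_body = QUERY.encode("utf-8")
```

This is why URL length limits only matter for the GET variant, which is the concern Sebastian and Cristiano raise below.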
McCool: class-names are visible? And do not match names in spec... cleaning-up?
Andrea: Yes, if they do not match we should align
Sebastian: Compiling SPARQL to a URL: is there a limitation in size?
Andrea: I am not sure.
… SPARQL standard defines it
Farshid: No limit... but there is response code
Andrea: 100% aligned with SPARQL
Cristiano: Limitation on client-side also?
… e.g. a browser cannot handle arbitrary length
Cristiano: What is the difference between what we have and SPARQL?
Andrea: 1. there is no difference
… 2. the aspect is about framing
… RDF can be returned
… the problem is not SPARQL, we are adding something on top of the standard
Sebastian: Normalized TD. We need to define what we mean by that
… could be Turtle
… information is the same
… form is different
… TD task force topic
Andrea: I created issue w.r.t. that in TD repo
… normalized for me is RDF
… simpler form is (framed) JSON-LD
… problem: framed JSON-LD is not RDF
… we lack framing rules
McCool: API changes required with such a change?
… need framing document
Andrea: mime-type could be used
Cristiano: Could we standardize the process?
Andrea: worked on this translation
… did not find a generic algorithm
<acimmino> https://github.com/w3c/wot-thing-description/issues/1015
Philipp: URL topic: can we stick to POST only?
… length/limits
Andrea: discovery is not mandated to be used with a browser?
McCool: Sometimes we need URL encoding
… suggest to keep it but put note about length
Andrea: user can choose
Lagally: Canonical representation
… we need to separate the discussion
… look at it in profile spec
… just heads-up
McCool: normalized != canonical
… need to clarify what "validation" means.. e.g. SHACL
Andrea: SHACL should not be applied in queries
McCool: need error response (besides JSON validation)
Andrea: SHACL/SHAPE ... different with SAREF
… more ontologies.. more changes
<McCool> https://github.com/w3c/wot-discovery/issues/143
McCool: --> 9 minute break -> 15 past
Discovery issues
Slides
Framing and Pagination
McCool: We talked about framing and pagination already
… will update slides with links later
Signing and Canonicalization
McCool: Signing and Canonicalization
… signing to preserve TD integrity
… important for directory
… re-structuring a TD might break signing
… canonicalized JSON exists
… but it does not deal with default values in TDs
… need to clarify that
… we need canonicalization before signing
… concept of "enriched TD", e.g., insertion time
… option to omit such data
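A minimal sketch of canonicalization-before-signing in the spirit described above (sorted keys, minimal whitespace, hash over the canonical bytes). Handling TD default values, which McCool notes is the open problem, is deliberately not addressed here.

```python
import hashlib
import json

# Sketch: canonicalizing a TD-like JSON document before hashing/signing,
# in the spirit of canonicalized JSON (sorted keys, minimal whitespace).
# TD default values and "enriched" directory fields are NOT handled.

def canonical_bytes(doc: dict) -> bytes:
    return json.dumps(
        doc, sort_keys=True, separators=(",", ":"), ensure_ascii=False
    ).encode("utf-8")

def digest(doc: dict) -> str:
    return hashlib.sha256(canonical_bytes(doc)).hexdigest()

# Two serializations of the same TD produce the same digest ...
td_a = {"title": "MyLamp", "@context": "https://www.w3.org/2019/wot/td/v1"}
td_b = {"@context": "https://www.w3.org/2019/wot/td/v1", "title": "MyLamp"}
assert digest(td_a) == digest(td_b)

# ... but an enrichment field (e.g., insertion time) changes the digest,
# which is why such fields must be omitted before verifying a signature.
enriched = dict(td_a, registration={"created": "2021-03-17T00:00:00Z"})
assert digest(enriched) != digest(td_a)
```

A real signature scheme would sign `canonical_bytes(...)` rather than the hex digest, but the round-trip stability shown here is the crux of the discussion.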
Lagally: Canonicalization is very important
… preserve original TD
McCool: retain original String is fallback
… receiver needs to check/match signature
… chaining is fine if one trusts directory
… we should prototype Canonicalization
… and validate it
Lagally: is it that complicated?
… provide some rules like defaults
McCool: hard part: roundtripping through databases
Lagally: RDF representation with additional restrictions
McCool: signature on information ideally
Philipp: proxy topic: proxy between oneDM and TD. Assumption that consumer can validate both
Lagally: validate or trust
Philipp: oneDM bridge.. not able to consume TD?
Lagally: Why not?
Philipp: the idea is *not* to understand the other protocol
McCool: form different but interactions the same. Sign parts of the TDs?
McCool: bottom line: Canonicalization is useful. Signing needs Canonicalization
… even JSON-LD has no stable solution
… the JSON-LD proof approach got dropped
Farshid: trust directory? TLS usable
Ben: TD created by device serving via HTTPS
McCool: signing allows caching/forwarding
… do not need to trust all parties involved
Farshid: first draft could be limited to TLS
McCool: at the moment we do not have signing
… TLS over local HTTP does not work
… relying on Wi-Fi security is weak
… should we push for signing or defer to later spec?
Farshid: Definitely useful... but we could live with current version
McCool: Canonicalization should be in TD spec.. not in profile
Validation
McCool: Topic "Validation"
… directory needs to validate TD
… what does that mean
… JSON schema could be used, but JSON schema is not a standard
… proposal: use syntactic validation
… proposal 2: semantic validation based on SHACL
… if JSON Schema becomes a standard, we can replace it
Lagally: baseline is syntactic validation, right?
McCool: Correct, syntactic validation required always
Ben: What does syntactic validation mean?
… what about extension?
McCool: JSON schema allows additionalProperties
… no strong validation
Ege: It only validates terms defined by TD spec
… e.g. validates "forms" but not "form"
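Ege's point can be illustrated with a toy checker: when additionalProperties is allowed, a typo'd term like "form" simply passes through unvalidated. This is a simplified stand-in, not the actual TD JSON Schema.

```python
# Toy illustration of why a TD with a typo ("form" instead of "forms")
# still validates: with additionalProperties allowed, unknown keys are
# simply ignored. NOT the real TD JSON Schema, just the principle.

SCHEMA = {
    "properties": {
        "title": {"type": "string"},
        "forms": {"type": "array"},
    },
    "additionalProperties": True,
}

TYPES = {"string": str, "array": list}

def validate(doc: dict, schema: dict) -> bool:
    for key, value in doc.items():
        sub = schema["properties"].get(key)
        if sub is None:
            # unknown term: allowed when additionalProperties is true
            if not schema.get("additionalProperties", True):
                return False
            continue
        if not isinstance(value, TYPES[sub["type"]]):
            return False
    return True

assert validate({"title": "MyLamp", "forms": []}, SCHEMA)
# The typo "form" slips through -- no strong validation:
assert validate({"title": "MyLamp", "form": []}, SCHEMA)
```

Setting `additionalProperties` to false would catch the typo, but would also reject legitimate TD context extensions, which is the trade-off discussed here.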
McCool: I still hope JSON schema becomes standard
Lagally: Question: JSON schema not standard. Can't we reference the current state?
McCool: for JSONPath we also refer to a draft
… problematic when coming to REC
Sebastian: I think we should not rely on JSON schema becoming a standard
… community accepts this kind of living standard
… JSON Schema uses different versions like draft-07
… we can say we stick to *this* version
Lagally: Agree
McCool: same problem with JSONPath
<FarshidT_> There is also specification for validation with JSON Schema: https://json-schema.org/draft/2020-12/json-schema-validation.html
Ege: talked with the JSON Schema editor; no plan to become a standard
Kaz: we should talk to PLH also
Security Bootstrapping
McCool: Topic "Security bootstrapping"
McCool: how to specify authentication for doing exploration
… self-description problem
… issue 135
… 3 options
… 1. default mechanism
… 2. protocol-specific negotiation, e.g. HTTP headers
… 3. two-phase approach
… TD does not provide this kind of security schema
… e.g. use top level security
Ben: use case webthing gateway, like reboot
McCool: how to authenticate is another key
… we can maybe discuss it offline
… private information to be used as fingerprint is a concern
Farshid: that's not mandatory
Ben: that's fine
McCool: directories may use nosec for the TD as it has no private information; maybe?
… any idea on the two-phase approach?
Ben: same idea as yours
(proposed in the issue 135)
<cris> +1
Farshid: you said that was protocol agnostic
McCool: let's also think about CoAP, etc.
… also bootstrapping for multiple TDs
… think error response is cleaner
tomorrow
Slides
McCool: Use Cases tomorrow on March 18
… summary list for the agenda would be helpful
Sebastian: will generate one
McCool: that's it for today
… thanks a lot for your talks, all!
… further issues to be captured on GitHub
[adjourned]
Meeting minutes
Opening
<kaz> Opening slides
today will mainly discuss use cases; only a 2h session
Sebastian: McCool to take minutes the first hour, Raggett the second hour
Sebastian: (presents opening slides, reviews agenda)
McCool: no guests today, all IG members
Use Cases and Requirements IG Note
Slides
Process
Lagally: start by reviewing process
… then I will present and walk through the current Use Cases and Requirements document
… which we plan to publish in two weeks
… we also have an ITU-T topic, but have to shift it to next Monday due to availability
… note that we plan to have a resolution to publish on April 7, this is a review of the candidate draft. Has been stable for a while, but seeking input before publication
… there has also been some cleanup, including commenting out sections that were incomplete
Lagally: objective of UC IG TF is to collect use cases and identify requirements, including collecting use cases from other SDOs
… and also collaborate with other TFs on requirements
Lagally: the process is as follows
… collect use cases proposals, have a shortlist to filter contributions (although most have been high quality and have been accepted), then identify gaps, etc.
… eventually analysis of use cases should result in a set of requirements for other TFs
Lagally: we don't have a strong formalism on use case description; have focused on getting input rather than being formal
… so want to keep the barrier low
… have a simple MD template, although eventually need HTML; Kaz created a converter
… github repo is here: https://github.com/w3c/wot-usecases
… under USE-CASES, usecases that have been incorporated into the document are under "processed"
… we also have some guidance as to appropriate terms to use, for instance, to refer to stakeholders
… these are given in the template
Lagally: (walks through template)
… there is both a requirements and a use cases template
… we also need eventually to distinguish between "horizontal" and "vertical" use cases
Lagally: note we introduced i18n requirements late, still not much content there
Lagally: driven by contributions
McCool: suggest that we see if we can get wide review... i18n and privacy, although we have talked to accessibility
Kaz: wide review is for normative docs, though it would be useful to get early feedback from them if possible
McCool: understood, but at least we can ask
Sebastian: nice if we can have a todo-list, especially for topics that are not covered yet
… e.g. OPC-UA, automotive, etc... so we know what's expected
Lagally: we collect these in issues, I have attached labels
… did try to make a guess at which ones we may be able to finalize
… but we are also dependent on contributions
Lagally: although I just noticed there is no issue
Sebastian: there is a PR though
Lagally: while we are talking PRs, there are a couple that should go into the current document
Lagally: can create OPC-UA issue, can you be owner?
Sebastian: yes, but we may also have other contributors from Schaeffler
… no specific name however
Lagally: note that requirements was moved from architecture to usecases, so if you see an old link to architecture note that they have been moved to use cases
… and will be in that document
McCool: reiterate that the doc is "Use Cases and Requirements" and will be informative
Document status
Lagally: let me review the structure of the actual document
… first talk about application domains and the purpose of collecting use cases from various domains
… by the way, looking at the doc in PR https://github.com/w3c/wot-usecases/pull/114
… this PR removes empty sections left over from template but not filled in; just commented out, however
… some things have been labelled differently or renamed
… also removed affiliations of people with companies; this needs W3C policy clarification
Kaz: did talk to the legal team about affiliations, especially for non-Members. First, we can include use case contributions from non-Members in the use cases doc. Second, we should not mention non-Member companies' names within the use case descriptions, because authorship of a W3C TR is reserved for W3C Members. Third, however, we can still acknowledge their work, mentioning their names and affiliations within the Acknowledgements section.
McCool: I think we should raise this with W3M directly and ask to put affiliations in, since the purpose of UCs is to collect external input
… and what if this was considered a CG document?
Kaz: ok to put all the affiliations down in the acknowledgements, however
David: there is a strong connection to the market, and while I'm sensitive to issues of company promotion, in this case the company affiliations are in fact very important for marketing reasons; "in good company"
Lagally: my personal take is that no-one reads appendices
McCool: ok, let's do another round with Wendy S.
<kaz> Preview of PR 114 - 4. Domain specific Use Cases
Lagally: going back to the doc, did not delete anything except a total duplicate section
… also took out some editorial notes about "add something here", instead we have issues for this now
… some of these todo are still in comments, however
McCool: just wanted to clarify that main use case should be simple and concrete and raise a requirement; please put variations under "variants" rather than complicating the main use case
Lagally: should also mention in many cases we just have placeholders for references but the details need to be filled in
… but this is just homework
Lagally: let's now look at the document; note that terminology is pulled entirely from Architecture
McCool: should "stakeholders" be terminology?
Lagally: hard to make consistent, and also open-ended; prefer to keep those terms informal
Lagally: then we have vertical, domain-specific use cases
… then have "horizontal" use cases that span multiple industries
McCool: VR/AR maybe belongs under Smart City, and 5.6.2 maybe belongs under 4.10.2... smart home vertical
Lagally: please make a PR
McCool: ok
Lagally: also "target users" -> "target stakeholders"... everywhere?
Kaz: as one of the authors of the VR/AR use case, I believe this should be kept within the horizontal use cases, though we can add some additional use cases or descriptions under the domain-specific use cases like smart city.
McCool: propose that we do some cleanup on the VR/AR topic (possibly splitting it up into a vertical and a horizontal use case) AFTER the proposed publication
McCool: maybe we could add horizontal/vertical tags?
Lagally: that's what we had before, would rather not revert at this point
Kaz: really just need a process where we collect use cases and then capture requirements; whether the use case is horizontal or vertical does not matter so much
<kaz> PR 114 Preview - 6. Requirements
<kaz> REQUIREMENTS area on the wot-usecases repo
Lagally: note that under REQUIREMENTS there are a set of additional contributions that have not yet been merged; and the current document is relatively light
McCool: should we be merging some of these? Discovery has been around for a while, for example
Lagally: don't think we can merge this now.
McCool: agree, too much extra content to deal with now
Lagally: also, acknowledgements should not be 6.3 but a new section or appendix; editorial change
proposal: merge PR114 https://pr-preview.s3.amazonaws.com/w3c/wot-usecases/pull/114.html for "Remove template leftovers"
<mlagally_> https://github.com/w3c/wot-usecases/pull/114
proposal: merge PR114 https://github.com/w3c/wot-usecases/pull/114 for "Remove template leftovers"
Resolution: merge PR114 https://github.com/w3c/wot-usecases/pull/114 for "Remove template leftovers"
PR 117
<kaz> PR 117 - Add liaisons
Lagally: there is another pull request that is on top of this one that adds a liaison chapter
Lagally: add a section on liaisons
McCool: we should add at least Conexxus and IETF
Kaz: and also OGC
<dezell> Conexxus is working on a WoT POC demo for our Annual Conference the week of April 26.
Lagally: feel uncomfortable about joint activity in the document
Sebastian: I think just mentioning that there is a liaison
… am fine with the current text, but can also point to existing liaison
… can add link
Lagally: David, can you provide a PR on Conexxus?
<mlagally_> https://github.com/w3c/wot-usecases/pull/117
David: ok, can do
<kaz> Preview of proposed section "7. Liaisons"
Lagally: let's then keep the PR open, work on it
McCool: let's do an email call for review and point to this PR as well as the main one
… but email on the liaison PR can wait until a week before
Kaz: have two questions
… would like to understand main intention of this section; already contributed, or future?
… IEC, ECHONET, ECLASS are in second category
… second, there is an official liaison table; we can mention that, and we should link to other press releases too if we include one for one SDO
McCool: is fairness a problem? If we link to one press release, should we link to them all?
<kaz> W3C liaison table
Lagally: is there an issue with really old liaisons?
McCool: either we point to a table, or have a cutoff date
Lagally: would rather have them in the doc than in a table somewhere else (so cutoff date approach ok)
Sebastian: we need an overview of liaisons... is something that should be on the marketing TF task
Lagally: yes, let's discuss in the marketing call
Issue 111
<kaz> Issue 111 - Automotive Use Case based on VSSo
Lagally: ok, let's look at the issues now
Lagally: automotive use case https://github.com/w3c/wot-usecases/issues/111
… is there someone who can contribute on this?
Sebastian: Daniel Wilms is our contact at BMW
Kaz: Benjamin Klotz from BMW has also been participating in WoT workshops, etc.
Kaz: I thought Benjamin was the original author of VSSo, so he might be working with Daniel.
Lagally: closing issue about deleting sections no one uses
Meeting minutes
Guest
McCool: we have a guest from ITU-T, Gyu Myoung Lee
McCool: skipping the opening slides; guests have been notified of the W3C patent policy
<kaz> W3C Patent Policy
Agenda for today
<kaz> Agenda wiki
Slides
Lagally: I'm really happy to have participants from ITU-T. Important for our work on WoT use cases
Lagally: we should look if there are any gaps in our use case document thanks to the input of ITU-T
Lagally: please observe the queue
Lagally: open discussion about alignment between W3C WoT and ITU-T, followed by architecture implications of the ITU-T hub
… should we add anything to the agenda?
… ok
Use Cases - ITU-T
Lagally shows a document that contains a review of ITU-T standards.
<kaz> Results of ITU-T SG20 WoT document analysis
McCool: I focused on the framework of the Web of Things document and the ITU-T WoT service architecture
… main question: what is an object?
… the hub is referred to as a broker in the document
… we don't emphasize hubs in our documents. I think we should
… the document describes abstract functions that could be mapped to hardware in different ways.
… services are categorized into WoT, Web, and mash-up services. We don't make this distinction.
Lagally: you mentioned that ITU-T needs a registration service
… currently the WoT discovery is work in progress
… will we still have this gap when WoT Discovery is defined?
McCool: well, ITU-T needs the registry at the architectural level.
… discovery is just about finding TDs, not registering them
… and surely not how to manage them
… it is intentionally out of scope in WoT discovery.
Lagally: about the second bullet in your document: is deployment of the Scripting API out of scope too?
McCool: well, similar to the previous point.
… it is a gap in the spec really
… like how to provision security parameters (e.g., keys)
Lagally: are there plans for future revisions of the ITU-T document?
Gyu_Myoung: yes, it is possible in two ways: a small clarification or starting a new process.
Lagally: do you think there is interest to align with WoT specification?
Gyu_Myoung: we did not use Thing Description to describe our services
Lagally: do you mean to create a mapping document between ITU-T and TD ?
… what do you need from w3c side?
Gyu_Myoung: possibly an expert from the W3C side to jointly discuss an analysis of the two standards.
Lagally: do you have already someone in mind?
Gyu_Myoung: we contacted individual editors and experts
Lagally: ok
… we'll discuss this topic further in the main call.
McCool: from our side there are some missing pieces that we would like to add. possibly at the end of the year we could create a new charter dedicated to them
Kaz: I would start from a concrete scenario based on some concrete use case and identify the building blocks (within the Architecture spec) or entities (within the existing implementations). Then we could understand which piece within the system or subsystems could be mapped to the building blocks from the W3C Architecture.
<kaz> (my point is not generating yet another use case but clarifying the concrete scenario and data transfer, etc., for the existing use cases related to the external standards :)
Lagally: we already have a couple of use cases defined
McCool: ECHONET and ITU-T WoT look more into smart home use cases
… having a real system is a good place to drive requirements.
Lagally: we should have a follow up conversation
McCool: I also suggest reviewing the document to check whether we have misunderstood anything.
Lagally: what about scheduling a call three weeks from now
… so that we can have a good plan
McCool: it could work, maybe defining homework by email would help
Lagally: let's try to target the week of April 12
McCool: it is probably fine
Lagally: ok, let's create a doodle poll to define the right timing.
Lagally: ok, we completed two points of the agenda. Point three could be moved to the next call
<kaz> agree with McCool, and think we should clarify which entity within the use case scenario does what (possibly with some restriction). The entities should include edge devices as Things, gateways as Intermediaries and applications as Consumers
Kaz: which entity in the use case detail is the most important?
… we should state it for each use-case
Action: kaz to create a doodle for the next liaison discussion around April 12
Architecture
Lagally going through the agenda items
Lagally: Any other input for the agenda?
McCool: Can we add Hub vs. P2P? Reason for this is the constrained devices topic.
Lagally adds it to the agenda
Introduction
Lagally starts with the introduction for the newcomers.
<kaz> WoT Architecture 1.1 - Working Repository
<kaz> WoT Architecture 1.1 Editor's draft
Lagally: We have a couple Todos in the document. So, be aware that we are still working on that.
Lagally: We have been focusing on profiles in the recent architecture calls.
Lagally: We have introduced the thing model.
Lagally: There are also several editor notes.
McCool: It probably makes sense to refer to the other documents, so we don't risk contradicting information.
McCool: There is a section "Core Profile"; that name should be just "Profile".
Lagally: We will have discussions about it and may change it.
McCool: We should introduce another section called architectural patterns or something like that.
<kaz> https://w3c.github.io/wot-architecture/#core-profile 8.3 Core Profile
Lagally: I tried to change our policy a bit.
(Sebastian joins)
Lagally: bf, can I add you to this issue?
Ben: I prefer not to. I don't agree with the specification. I think it shouldn't exist.
<kaz> 10 Example WoT Deployments
<kaz> clarification on network topology, etc., would be useful for liaison discussions. However, I think
Kaz: I think it doesn't have to be the normative architecture design, but could be part of the deployment scenario section.
Lagally adds a new issue for the architectural pattern topic
Terminology
<kaz> Terminology issues
Lagally: All topics have owners and mm already addressed a lot of topics. Thanks for that.
McCool: My PR solves 3 or 4 of those issues.
New terminology for the Binding document?
Lagally: I would advocate adding it to the terminology section.
McCool: Also discovery. I think it is a core thing.
McCool: One question. Are those terms defined in other documents as well?
McCool: Should we redefine them or just refer to TD 1.1?
Ege: The idea is to remove the td context extension.
Lagally: Let's create an issue for it.
Sebastian: You can also add the thing model description.
Sebastian: We should have a single definition.
Lagally: I think we already spent some time to define the thing model in the architecture.
Sebastian: We should check, if it is the same as in the TD document.
Lagally creates an issue for it
Lagally: TD Fragment and partial TD
Lagally: We discussed this and added it to the document.
McCool: I don't disagree with the definition. To be aligned with the JSON specification we should name it JSON element.
Lagally: I think we already discussed this topic.
McCool: I don't think it is the end of the world; it would just be more aligned with the JSON specification.
Lagally: Have you dealt with the system terminology?
McCool: I haven't yet. It isn't in my PR yet, but I want to take a look into it.
<kaz> PR 582 - WIP: Terminology update
McCool: I want to solve as many terminology issues in my PR as possible. At least the non controversial ones.
Lagally: It would be good to not introduce new terms.
McCool: You are right, some might be controversial.
Lagally: I am not comfortable with renaming TD Fragment to TD Element.
Lagally: Please take it out and introduce another PR for it.
McCool: Okay, will do.
McCool: We also need to cleanup the confusion with Thing description and Thing description directory.
Lagally: We should have an additional call about this.
<mjk> Digital Twin usually includes modeling of the physical system
<kaz> Issue 530 - Incorporate Discovery terminology into terminology section
<kaz> Issue 581 - Consolidate usage of gateway, edge and hub
(Philippe Coval on IRC wonders about modeling Digital Twins)
(Kaz gives clarification to Philippe that we're currently talking about terminology for including "Digital Twins" but not discussing "Digital Twins" itself. Also suggests we have a separate discussion on Digital Twins during the ordinary WoT Architecture call after the vF2F, and we can invite Philippe to that call.)
(Lagally agrees and Ben transfers that to Philippe on IRC)
McCool: I added a definition for edge device. We probably have to review them.
<kaz> going back to PR 582 again
Lagally added a comment to the PR
<kaz> PR 583 - fix a typo
Lagally: It is just a simple typo. I really would like to merge it.
McCool: Let's not get into the IPR thing and just change it ourselves.
Kaz: can merge it after marking it "editorial" using the Repository Manager, if it's really editorial
Lagally: (agrees)
PR 583 merged after marked as editorial
Accessibility
Lagally: This was a topic from the APA meeting.
McCool: I think there is a wider review process.
McCool: I think when we get a more solid specification, we should request a review.
other spec contributions
McCool: As part of IETF, there is a new draft; I think we might want to take a look at it.
system lifecycle with registration
<McCool> the IETF draft on onboarding and bootstrapping: https://datatracker.ietf.org/doc/draft-sarikaya-t2trg-sbootstrapping/?include_text=1
security considerations
Lagally: Do we need to do anything here?
McCool: I created a section for this. Should we put it into the main architecture document?
Lagally: We have a security and privacy considerations section in the specification.
Lagally adds issue 587
<kaz> [5min break; then Profile discussion]
WoT Profile
Lagally: (shows the agenda slide)
… Introduction
… device categories
… constraints
… canonicalization
… discussion on one/multiple profiles
… review/discussion of FPWD feedback
Lagally: anything else?
McCool: should start with the scope of "WoT Profile"
… wouldn't take too much
… related to the topic on one profile or multiple profiles
Lagally: ok
Lagally: (adds "scope of WoT Profiles" to the agenda for today)
McCool: should have discussion on that first
Lagally: ok
Sebastian: would see that we have consensus about "Profile"
… would like to keep it simple
McCool: yeah, that's why wanted to put it as the first topic
WG Charter
Lagally: (explains excerpts from the WoT WG Charter)
WoT WG Charter
McCool: want to say "implementations" here means "finite implementability - a developer needs to know in advance the set of technologies they need to include in their implementation, and this must be a finite set"
… please don't assume context means "vertical"
Sebastian: each IoT product also has this
… not sure we really want to have "Plug-n-Play"
… what if we have no clue on semantics
… we can also narrow the scope to communication, etc.
Lagally: semantic interoperability and semantic PnP would be nice
McCool: actually, the Charter description implies "more than one" profile
Lagally: ok
… we have three profile use cases
<McCool> McCool: just want to point out the charter uses "profiles" in the plural and implicitly assumes there may be more than one
use case draft
Lagally: Use case: multi-vendor system integration out of the box interoperability
5.2 Multi-Vendor System Integration - Out of the box interoperability
Lagally: as a device owner, developer, cloud provider, ...
… the model here is multiple vendors adapt to a standard
… this should be possible without device-specific customization
… Use Case: Cross Protocol Interworking
5.4 Cross Protocol Interworking
Lagally: examples in smart home, smart city, ...
-> Use Case: Digital Twin
5.3 Digital Twin
Lagally: Conclusion in the Architecture call on 21 Jan. 2021
https://www.w3.org/2021/01/21-wot-arch-minutes.html
McCool: we should focus on use cases
… some of them apply WoT in general
… within certain use case, some specific protocol would be applied
… a use case for that purpose is digital twin
… let's just focus on the context first
Lagally: ok
Sebastian: I have a problem with the use cases for the profile discussion
Lagally: let's do some simulation for TD then
… we have to make some basic assumption
Sebastian: don't see the description yet
McCool: digital twin could apply to all of WoT
… digital twin is one context
… we should clarify what context to be used
… the constraints applied to everywhere should be included in the basic specifications, not in a profile that only applies to one context
Kaz: would repeat my point for liaison discussion here :) @@@
Lagally: that's related to device capability
… would want to see profile requirements with more than one supporter
requirements for Profile
Lagally: interoperability, limit and reduce complexity, ambiguities, ...
McCool: some of them might be "nice to have"
… should clarify our actual requirements
<McCool> McCool: some of these are absolute requirement, some are nice-to-haves, some belong in general goals for WoT (eg. eliminate ambiguity)
McCool: would like to describe the issue on the goals and scope
wot-profile issue 73
McCool: (goes through the issue 73)
… we need to think about the context and narrow the scope
… we also should pick up one specific profile
… and would like to propose we start with the hub concept
… we don't worry about P2P interoperability
… having narrower scope would mean we would have more concrete answers
… limited to what we have experience
… let's talk about narrowing the context
… and let's pick one
Lagally: tx for creating this issue 73, first
[[
Assume the hub has a relatively large memory capacity and capability for consuming Thing Descriptions.
Assume endpoints will not, themselves, consume Thing Descriptions.
]]
McCool: we can have a Hub as a Consumer
Ben: tx from me as well
… "hub" as the first profile proposed by McCool here
… included in Mozilla's Member Submission
… but what I want to have is concrete description how to communicate with devices
Lagally: gateway also could have some restriction
… how to handle big TDs in that case?
Ben: actual size of TD should be relevant for housekeeping
Lagally: what do you guarantee about how big a TD can be handled?
… can safely reject the TD?
… what would happen otherwise?
Ben: you don't have a "maximum size" for Web pages, right?
… don't see difference with WoT from that viewpoint
(some more discussions on possible use case settings)
McCool: I used terms of "endpoint" and "hub"
… we assume "consumer" is relatively bigger
Ben: agree
<McCool> McCool: think we should just define "context" as "a set of assumptions"
Cristiano: concern on using a generic concept at the protocol level
… maybe would be better to narrow the scope
Lagally: what kind of payload can be handled could be an additional constraint
Sebastian: this is not a real argument
… want to agree with McCool here except concentrating on HTTP, CoAP and MQTT
Lagally: should work on websocket as well?
Sebastian: another possible future protocol as well
… no restriction on possible protocol to be mentioned here
… maybe Ben can work on draft text for that
Ben: can work on it
Lagally: what is the fundamental problem then?
Sebastian: would propose separating the document into (1) technology with HTTP+JSON and (2) others
Lagally: (goes through the section 4)
4. Profiling Mechanism
Ben: would suggest we remove the profile section and concentrate on the protocol binding section
Lagally: protocol binding within the WoT Profile draft is just a placeholder at the moment
5.2 Protocol Binding
Lagally: creating a profile for HTTP+JSON would be helpful, though
… the goal of TD is defining the datamodel
Ben: but the current description within the "WoT Core Data Model" would add complexity
McCool: main issue is the motivation
… and what is the accomplishment
… need to be clear about what to accomplish
… we need a documentation for developers
… all we need is narrow scope and concrete description
… maybe we could generate a draft using MD and see which part to be applied to the WoT Profile draft
Philipp: I have a hard time understanding why we use outdated protocols. I think we should focus on the current specifications of protocols. If we are going with HTTP, it probably makes sense to go with HTTP/2 or HTTP/3.
<McCool> McCool: let's focus on things for now that we have direct experience with and a clear set of needs
<McCool> ... again, something concrete that we can "get in the can"
<McCool> ... however, I agree with Philipp, there is a need to have a "constrained"-focused profile that perhaps deals with these issues... but we can defer, and I think we have to
Kaz: basically agree with McCool, and would repeat we should clarify our expectations on which entity (Thing, Intermediary or Consumer) does what and has what kind of restriction based on some concrete use case and then clarify our requirements. and then we can see what kind of profile is needed.
Sebastian: still need the definition on what "core" means
Sebastian: we don't want to keep less information within the TD rather than big text data
… we should keep out a concept of "core" profile
… though could think about some "generic" information
… all the specific profile to be handled separately
<Zakim> dape, you wanted to say limiting *known* terms/constructs might not help in all cases
Daniel: there are several layers
… quite crucial to have initial setup
… limiting known terms/constructs would not work in some cases
<citrullin> +1 as well, no one guarantees that people will not exceed it. Also there are still different protocols. The market will eventually find a common ground and some protocols will win, other not.
McCool: we still have different opinions on Profiles
… need to resolve a lot of things to move forward
… we need a follow-up discussion during the regular Profile discussion
… let's start with one specific profile first
Lagally: ok
… agree this direction on issue 73 is right one
… please create Merge Request if you think any part of the current draft is not appropriate
… let's continue the discussion during the next Architecture/Profile calls
Next meeting
Slides
McCool: on Wednesday March 24
Note on how to proceed
Kaz: please note that creating actual PRs for Profile discussion before getting consensus wouldn't make sense
… creating issues would be fine, though
McCool: ok
… let's start with my issue 73 then
[adjourned]
Meeting minutes
opening / agenda
<kaz> Opening slides
<kaz> updated Opening slides
Sebastian: reminder: contribute to use case PRs and issues
… profile discussions in issue #73
… We may need to increase the profiling time. One hour is often not enough.
McCool: there are also a number of discovery issues requiring feedback. Editorial ones may be merged today.
Christian: the listing PR should not be merged yet. It is still under discussion.
Lagally: the next architecture call, after Easter (April 22) will be 2 hours and dedicated to profiling discussion.
<sebastian> Kaz we cannot hear you?
Sebastian: today's agenda will cover:
… What's new in TD 1.1
… Next TD publications
… Thing Model
… TD canonicalization
… News from iotschema.org
Sebastian: the profiling discussions should continue on the issue tracker and in PRs. Three weeks break is a long time.
Kaz: Ben has some long and reasonable comments.
Ben: Can create a PR with the proposal
McCool: Better discuss and resolve open issues before creating a PR. Otherwise the discussion will be moved to the PR.
McCool: It is best to have small PRs with concrete proposals that we can quickly agree on and move on.
<kaz> kaz: we should be careful how to handle the contributions. starting with summarized comments from each contributor on Issue 73 with possible disposition of all the comments would be useful for the discussion on April 22.
What's new in TD 1.1
Slides
Form
<kaz> 5.3.4.2 Form
Sebastian: added (un)observeallproperties type for op
Ege: Is this referring to when "any" property is changed or when "all" properties are changed?
Ege: will the response include all changed properties or only those that have changed?
Zoltan: If the implementation doesn't support an observeall in a single transaction, should it fail or fall back to another op?
Sebastian: if the implementation doesn't support it, then it should not advertise it in the TD.
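[scribe note: as a non-normative illustration of the point above, a top-level form advertising the new operations might look like the following sketch; the endpoint URL and subprotocol are hypothetical, not taken from the spec]

```python
# Illustrative sketch only: a top-level "forms" entry advertising the new
# TD 1.1 (un)observeallproperties operations. The href and subprotocol are
# hypothetical; an implementation that cannot support the operation should
# simply not advertise it, as noted in the discussion.
form = {
    "op": ["observeallproperties", "unobserveallproperties"],
    "href": "https://device.example.com/properties/observe",  # hypothetical
    "subprotocol": "longpoll",
}
```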
EventAffordance
<kaz> 5.3.1.5 EventAffordance
Ben: if there is observeallproperties, should there also be a subscribe-all for events?
Ben: this currently exists in WebThings and also the directory spec. We may need an equivalent to this for events. Also, we may need to get past events.
McCool: It makes sense, but need to discuss in a separate issue.
Sebastian: This may be offered by a subprotocol, but adding everything to the top level spec may not be appropriate.
Ege: The op values are not properly explained. It would be useful to add a table (instead of sentences in a paragraph) to explain individual values.
Sebastian: It is a good point.
McCool: It will be useful, but the descriptions should be protocol agnostic.
<kaz> 5.3.1.5 EventAffordance
Link
Sebastian: Next change is the relations table
<kaz> 5.3.4.1 Link
Sebastian: possible values are listed. Mostly based on existing IANA link relation registrations.
McCool: we also have a relation type used in discovery (describedby), which didn't make to this list
… extends is strange in the given context
<victor> my question: can you elaborate on 'controlledBy'?
McCool: proxy-to may be redundant. Need to check if the use is different with the existing proxy relation type.
Victor: What is the purpose of controlledBy relation type?
<victor> ack
Sebastian: controlledBy refers to devices controlled by another Thing.
Ben: is manifest used in the right context?
<mjk> https://html.spec.whatwg.org/multipage/links.html#link-type-manifest
McCool: I think the IANA explanation specifies the manifest format. which may not be TD. Need to look into that,
Ben: Will create an issue to discuss the manifest relation type
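[scribe note: for reference, a `links` array using two of the relation types discussed might look like this sketch; the URLs and the pairing of `rel` values with targets are invented for illustration]

```python
# Hypothetical illustration of link relation types mentioned above
# (controlledBy and describedby); URLs are invented for the example.
links = [
    {"rel": "controlledBy", "href": "https://hub.example.com/things/controller"},
    {"rel": "describedby", "href": "https://directory.example.com/things/lamp"},
]
```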
exclusiveMinimum and exclusiveMaximum
<kaz> 5.3.2.4 NumberSchema
Sebastian: Next change: exclusiveMinimum and exclusiveMaximum
McCool: We need to look into the discussions in SDF. They have been working on these for a while. We should make sure we don't have any contradicting definitions.
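[scribe note: the exclusive bounds follow JSON Schema semantics, where the bound value itself is not allowed; a minimal sketch, with an illustrative validation helper that is not a spec algorithm]

```python
# NumberSchema sketch: exclusiveMinimum/exclusiveMaximum exclude the
# bound values themselves, following JSON Schema semantics.
schema = {"type": "number", "exclusiveMinimum": 0, "exclusiveMaximum": 100}

def validates(value, s):
    """Minimal check for the two exclusive bounds only (illustrative)."""
    if "exclusiveMinimum" in s and not value > s["exclusiveMinimum"]:
        return False
    if "exclusiveMaximum" in s and not value < s["exclusiveMaximum"]:
        return False
    return True

assert validates(50, schema)
assert not validates(0, schema)    # the bound itself is excluded
assert not validates(100, schema)  # the bound itself is excluded
```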
AdditionalExpectedResponse
<kaz> 5.3.4.4 AdditionalExpectedResponse
Sebastian: Next change: one or more AdditionalExpectedResponse entries
… each AdditionalExpectedResponse may have success, contentType. The schema field is currently under discussion and may be removed.
McCool: The success is false by default
… the schema may be moved outside and made reusable
Ben: What is the use case for this addition?
McCool: There could be multiple success responses distinguishable by means other than the content type.
… The existing schema did not allow defining that and also the various error responses.
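[scribe note: a hedged sketch of a form carrying an AdditionalExpectedResponse as discussed; the href and content types are hypothetical]

```python
# Illustrative sketch: a form with an additional expected response
# describing an error case. Per the discussion, "success" defaults to
# false, so such entries typically describe error responses
# distinguishable by contentType.
form = {
    "href": "https://device.example.com/actions/toggle",  # hypothetical
    "contentType": "application/json",
    "additionalResponses": [
        {"success": False, "contentType": "application/problem+json"},
    ],
}
```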
uri assignment for authentication location
<kaz> 5.3.3.1 SecurityScheme
Sebastian: Next change: URI assignment for authentication location
McCool: there is a PR right now to extend the body to include a JSON pointer into the data schema
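[scribe note: a sketch of the URI security location idea; the parameter name and URL template are hypothetical, not from the spec text]

```python
# Hypothetical illustration: an API-key security scheme whose key is
# carried in the URI itself ("in": "uri"), with the form href containing
# a matching URI template variable to be filled with the key.
scheme = {"scheme": "apikey", "in": "uri", "name": "key"}
form_href = "https://device.example.com/status?key={key}"  # hypothetical
```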
Thing Model features
<kaz> 10. Thing Model
Sebastian: concepts, modeling tools
… Modelling tools: versioning, extension and import, placeholder, required
… derivation to TD instances
… the algorithm and process. This is subject to change. The explanation will be improved.
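[scribe note: one step of the derivation, placeholder substitution, can be sketched as follows; the `{{...}}` pattern follows the Thing Model draft, but the helper, names, and values are illustrative, and the algorithm itself is subject to change as noted above]

```python
import json
import re

# Illustrative sketch of placeholder substitution when deriving a TD
# instance from a Thing Model. The device name and URI are invented.
tm_fragment = '{"title": "{{DEVICE_NAME}}", "base": "{{BASE_URI}}"}'
values = {"DEVICE_NAME": "Lamp-42", "BASE_URI": "https://lamp.example.com"}

def fill_placeholders(text, mapping):
    """Replace every {{NAME}} occurrence with its mapped value."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: mapping[m.group(1)], text)

td_fragment = fill_placeholders(tm_fragment, values)
assert json.loads(td_fragment)["title"] == "Lamp-42"
```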
McCool: the "required" field isn't in TD, so a Thing Model cannot be considered a partial TD.
Ben: TM adds complexity to implementations. The complexity should ideally be shifted to the cloud.
Sebastian: TM is not meant to result in a TD at runtime. It is mostly useful for creating TDs during development/configuration/startup time.
Ben: it is not necessary to turn this into a standard. Also, requiring clients to be able to process them adds complexity.
Sebastian: Standardizing it makes it easy to reuse models
<Zakim> dape, you wanted to comment on the required part of JSON properties
Daniel: The inheritance may need to be reworked. Some things are restricted in TD but allowed in TM. We need to improve this to simplify validation.
… ideally, we should start from an abstract model and define other things on top.
Lagally: We should maintain the completeness of TD with semantic annotations.
Kaz: technically, TM is not part of the normative Thing Description features, so probably it would be cleaner to split it (=TM) into a separate document on best practices or implementation guides.
Sebastian: TM was "Thing Template" in an appendix of Thing Description 1.0; it's becoming more official these days and has moved to Section 10. However, I agree there is still the possibility of making it a separate spec.
McCool: The TMs need to be addressable if we want to maintain them in the directory.
Ben: Do consumers need to understand a TM if they come across a TD which links to one? See Example 50; the TD is incomplete without the TM.
<kaz> Example 50
McCool: We need to define validation to check this. There are also security considerations. The type definition is just to check if the TD is compatible with the model.
… The TD should be self-contained.
… If there are required fields in the TM, should we fetch the TM to validate the TD? This needs further discussion.
Sebastian: Yes, the TD is self-contained and the type link is just for validation.
Ben: WebThings does this kind of validation with semantic annotations. So there might be some redundancy here.
Sebastian: Yes we need to check this
Publication Roadmap
Slides
Sebastian: next working draft by end of April
… we have many open issues and with Easter holidays in between, it may not be realistic to have all included.
… regarding the candidate recommendation (CR), we need at least one more plugfest to test everything
Lagally: Adding vocabulary and links for profile will be easy
Ben: What is the deadline for adding things to TD 1.1 and what goes to TD 2?
McCool: We are behind schedule. We should have a solid draft by June 1st.
Sebastian: We should discuss on issue tracker to know what is realistic. We also need a plugfest to test some new features.
McCool: We may need a plugfest in June. October is too late.
Lagally: canonical and signing can be included in profiles and moved to TD later.
McCool: I think canonicalization belongs to TD. It is not so difficult to add. Also, we need a formal validation process.
<kaz> +1
McCool: Deadline proposal: May 15th: complete draft. June 1st: pre-CR and plugfest
Lagally: canonicalization will be simple if we follow existing specs
McCool: validation is a lot of work, but not so complicated
McCool: validation is important before storing TDs in a directory.
Kaz: Please remember that we need an implementation report plan doc (including assertion list) and also need to identify features at risk for CR transition.
McCool: The plugfest will show the implementations status
<kaz> kaz: provide assertion lists and identify features at risk
<McCool> june plugfest should probably be a "testfest" and should have the goal of generating a draft implementation report
Ege: there should be a deadline to have feature-freeze to allow stable implementations
<McCool> ... and can then identify at-risk items
<FarshidT> sk: 10 minutes break
<kaz> [10-min break]
Thing Model
Slides
sebastian shows slides summarising issues arising from the plugfest
The required term is from JSON schema
but is at the same level as properties
McCool: move it up a level
It is at the wrong place and would preclude properties named "required"
<McCool> to dape: it can just apply to all interactions
sebastian cites work with SDF
<McCool> although... then we can't have properties and actions with the same name. Sigh...
<McCool> (anyhow... sebastian's proposals seems to address this)
<Zakim> dape, you wanted to one level up causes issues with properties/actions conflicts
dape notes that moving "required" up a level would preclude having the same name for properties and actions, etc.
McCool: sebastian's proposal uses a URI path to avoid that
Lagally: I am a bit puzzled about whether JSON-LD already provides a solution.
Sebastian: would we need to use SHACL for this?
McCool: we could use "tm:required" here
sebastian asks mlagally if he is against this proposal
Lagally: no
<benfrancis> FYI https://json-schema.org/draft/2020-12/json-schema-validation.html#rfc.section.6.5.3
<kaz> kaz: just to make sure, do you mean "required" here is affiliated with the context/namespace of "tm"?
McCool: I propose we have a context file for thing models
as distinct from thing descriptions
Cristiano: +1 for this proposal
Sebastian: I will raise this as a GitHub issue for further review
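A minimal sketch of what the path-based proposal might look like, assuming a top-level tm:required member holding JSON-pointer-like paths (the exact vocabulary, including tm:ThingModel and tm:required, is still open):

```typescript
// Sketch of the path-based proposal: a top-level tm:required listing
// JSON-pointer-like paths, so property and action names cannot clash.
// The member names here are assumptions, not final vocabulary.
const tm = {
  "@type": "tm:ThingModel",
  "tm:required": ["/properties/status", "/actions/toggle"],
  properties: { status: { type: "string" } },
  actions: { toggle: {} },
};

// Minimal pointer resolver, e.g. to check that required affordances exist
function resolvePointer(doc: any, pointer: string): unknown {
  return pointer
    .split("/")
    .filter((seg) => seg.length > 0)
    .reduce((node, seg) => (node == null ? undefined : node[seg]), doc);
}

const missing = tm["tm:required"].filter(
  (p) => resolvePointer(tm, p) === undefined
);
console.log(missing.length); // → 0
```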
Sebastian: the extends feature doesn't yet support importing sub-definitions
<McCool> comment: likewise, "extends" could be "tm:extends", and only allowed in TMs...
Sebastian: sometimes you want to borrow part of a model, but not the whole model
<cris> we could move also ThingModel -> tm:ThingModel
<Citrullin> +1 on tm:ThingModel, I also prefer some kind of prefix for it.
Sebastian: JSON schema uses $ref, whilst SDF defines sdfRef
We need something like a macro inclusion ...
We could use tmRef or perhaps tm:Ref
McCool: this is basically a JSON pointer
adds +1
Sebastian: to clarify, this is only for tm not td
the imported definitions should be self-contained
McCool: we may need to allow for empty models in respect to validation
Koster: I agree with this
Cristiano: can you import a model and extend it?
Sebastian: yes
Koster talks through the extends behaviour
we should clarify what we mean
McCool: there is the potential for conflicts when importing stuff
I prefer to use ref to identify just what you want to pull in
Koster: the term "type" is preferable
McCool: "extends" would just indicate a dependency, but not the details
Cristiano: why do we need "extends" in that case if it is redundant?
Cristiano: tm:ref would override extends
Koster: we could say that ref doesn't change the meaning ...
"extends" brings some baggage we don't need
we need to define the processing model for interpreting TDs and TMs
Sebastian: I will create an issue to gather further review
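To make the partial-import idea concrete, here is a hypothetical pre-processing step for a tmRef/tm:ref-style mechanism. The member name, the example URI, and the resolution semantics are all assumptions pending that discussion.

```typescript
// Hypothetical tm:ref resolution: a definition points at a fragment of
// another model via URI + JSON pointer, and a pre-processing step inlines
// just that fragment rather than extending the whole model.
const library = {
  properties: {
    temperature: { type: "number", unit: "om:degreeCelsius" },
  },
};

function resolveRefs(doc: any, models: Record<string, any>): any {
  if (doc === null || typeof doc !== "object") return doc;
  if (typeof doc["tm:ref"] === "string") {
    const [uri, pointer] = doc["tm:ref"].split("#");
    const target = pointer
      .split("/")
      .filter((s: string) => s.length > 0)
      .reduce((n: any, s: string) => (n == null ? undefined : n[s]), models[uri]);
    // imported definitions are expected to be self-contained, so a deep
    // copy of the fragment is enough
    return JSON.parse(JSON.stringify(target));
  }
  const out: any = Array.isArray(doc) ? [] : {};
  for (const key of Object.keys(doc)) out[key] = resolveRefs(doc[key], models);
  return out;
}

const tmWithRef = {
  properties: {
    temp: { "tm:ref": "https://models.example/base#/properties/temperature" },
  },
};
const resolved = resolveRefs(tmWithRef, {
  "https://models.example/base": library,
});
console.log(resolved.properties.temp.type); // → number
```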
Kaz: we want to think about compatibility between WoT TM and SDF, right?
we could explore this in the next plugfest
sebastian cites the discussion on media type for Thing Models
McCool: we need to include this in the IANA registration
we need to register our use of JSON pointer for both TD and TM
<kaz> sk: need more discussion and continue
feedback from PlugFest about SDF-TM usage
Slides
mjk provides feedback on OneDM and WoT plugfest
Exploration of generating TM/TD from oneDM's SDF
He cites a Modbus example
slide with model construction work flow
SDF doesn't really describe data schemas
it relies on the protocol and instance bindings
Koster: I also worked on semantic annotations
Some questions about terms
Sebastian: your @type is from RDF, right?
I would expect a reference to RDF representations
Koster: @type just gives a URI and could point to Turtle or JSON-LD as needed
Use of RDF would enable use of RDF tools
Ben: this is what schema.org does, and we've followed it for webthings
<victor> from the JSON-LD rec: ""@type" value must be a string, an array of strings, an empty object, or a default object."
mjk talks about gaps in the SDF conversion process
<victor> the RDF data model makes a distinction between resources (identified by URIs) and representations (which has a specific content types and "represent" resources)
some changes for SDF 1.1
JSON schema required for input and output elements
<victor> so, adding a content type to an RDF class shouldn't be possible
Koster: also we now have sdfChoice for enums
and a means to annotate generated TD/TM from the SDF source references
some stuff to do with fixed point decimal numbers
along with min/max and scaling
an open issue about non-linear scales, e.g. log scales
The Modbus experiments were helpful
some suggestions for vocabulary
need to specify precision for Modbus data, e.g. 16 bits
need to map array contents to properties
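The linear scaling mentioned above can be sketched as engineering value = raw * scale + offset. The function below assumes those SDF-style semantics; non-linear scales (e.g. logarithmic), as noted, remain an open issue and are not covered.

```typescript
// Assumed SDF-style linear scaling for raw fixed-point register values:
// engineering value = raw * scale + offset.
function decodeRegister(raw: number, scale: number, offset = 0): number {
  return raw * scale + offset;
}

// e.g. a 16-bit register counting quarter-units
console.log(decodeRegister(94, 0.25)); // → 23.5
```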
Sebastian: thanks for sharing this with us, please share the slides too
<sebastian> https://github.com/w3c/wot/tree/main/PRESENTATIONS/2021-03-online-f2f
Canonicalisation
Slides
Lagally: here are a few slides ...
architecture discussion now as GitHub issues
We're interested in canonical forms of TDs
useful for TD comparisons, crypto etc.
JSON has a canonicalization scheme RFC8785
we need additional rules and clarifications, e.g. default values
prefixes, array ordering, structural ordering
McCool: we also need to decide when we have multiple ways to express things
Further discussion needed
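As a baseline, the RFC 8785 scheme mentioned above boils down to deterministic serialization with sorted object keys. The sketch below shows only that core; real JCS also normalizes number and string serialization, and the TD-specific rules for defaults, prefixes, and ordering would come on top.

```typescript
// Minimal sketch of RFC 8785-style canonicalization: recursively sort
// object keys before serializing, so equivalent TDs serialize identically.
// This omits JCS number/string normalization and any TD-specific rules.
function canonicalize(value: unknown): string {
  if (value === null || typeof value !== "object") {
    return JSON.stringify(value);
  }
  if (Array.isArray(value)) {
    return "[" + value.map(canonicalize).join(",") + "]";
  }
  const keys = Object.keys(value as object).sort();
  const entries = keys.map(
    (k) => JSON.stringify(k) + ":" + canonicalize((value as any)[k])
  );
  return "{" + entries.join(",") + "}";
}

console.log(canonicalize({ title: "Lamp", id: "x" }));
// → {"id":"x","title":"Lamp"}
```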
Lagally: now let's talk about signed TDs
essentially sign canonicalised form
see also RFC7515 JSON web signature
envisage need to update the signing algorithms
open issues around self contained TDs and TMs
McCool talks about role of URLs in signing
McCool: I don't think we need to rely on JSON-LD canonicalisation
Some challenges with strings
<Zakim> dape, you wanted to say default values are omitted. 2 questions: 1. remove default values 2. likelihood to have additional default values
Daniel: omitting default values means having to prune them before signing
McCool: if a property has the default value, we should omit it as it is redundant
Defaults can be changed at major version updates
McCool: signing involves chaining
the signature says who you have to trust
we need to canonicalise relative URIs etc.
Lagally: forcing alphabetically sorted statements is bad for intelligibility
better to specify ordering
McCool: I will prepare a pull request
IoT Schema
Slides
The IoT schema CG is rechartering
we're more aligned with web of things than schema.org focus on web search
The idea is to support a catalogue of device descriptions
There are a lot of existing models we can import
we want to normalise the models from the heterogeneous representations
is there enough critical mass, can we build a new consortium?
from those already in W3C
we're looking for people to sign up ...
Ben: I am very supportive of this work, we have 20 plus models for webthings
happy to contribute those models
what is the overlap with thing models? I will create a GitHub issue to discuss this further
Koster: good question
Sebastian: +1
<kaz> +1
<benfrancis> FYI https://webthings.io/schemas
Sebastian: we could include examples in the TD spec
<sebastian> https://www.w3.org/community/iotschema/#
Everyone is welcome to join the CG
<victor> I've got to go. Have a good day, all
kaz makes a comment on where the work takes place, e.g., possibly the WoT group could work on requirements and ask the CG(s) and the SDF people to do the actual spec work instead :)
sebastian wraps up today's meeting.
Tomorrow
Slides
McCool: we have a tight schedule tomorrow, please email me any agenda changes
Sebastian: no WoT calls next week
<kaz> [adjourned]
Meeting minutes
Scribes for today
scribes so far: Ege, Kaz, Daniel, Sebastian, McCool, Philipp, Cristiano, Dave, Farshid
Koster for the 1st part; Cristiano for the 2nd part
Logistics
Opening slides
McCool: no calls next week including the main call
… next main call will be on April 7
McCool: there are 2 weeks spring break coming up
McCool: planning to have the discovery meeting on Monday the 29th
scripting update
<kaz> Scripting slides
Spec status
Zoltan: summary of PRs and outstanding issues
… changes in InteractionInput and InteractionOutput data modeling
… uses either JSON value or readable stream
… JSON value if there is a schema, otherwise a stream with a content type
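A simplified model of that split, with assumed (non-normative) shapes: value() yields a parsed JSON value when a data schema is known; otherwise the consumer falls back to the raw payload plus its content type.

```typescript
// Simplified, non-normative model of the InteractionOutput data modeling
// discussed above: value() parses the payload as JSON only when a data
// schema is known; otherwise consumers read the raw payload via contentType.
class InteractionOutput {
  constructor(
    private raw: string, // stands in for a readable byte stream
    public contentType: string,
    public schema?: object // DataSchema from the TD, if any
  ) {}

  async value(): Promise<unknown> {
    if (!this.schema) {
      throw new Error("no schema: read the raw stream instead");
    }
    return JSON.parse(this.raw);
  }

  get data(): string {
    return this.raw;
  }
}

const out = new InteractionOutput('{"level":42}', "application/json", {
  type: "object",
});
out.value().then((v) => console.log(v)); // → { level: 42 }
```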
ConsumedThing and ExposedThing
Zoltan: the consumed thing changes include typing data input and output
Cristiano: should we check more conditions on validating consumed TDs?
… there is an issue with exposed thing as well
McCool: there might be 2 more stages of validation beyond schema validation
McCool: the categories are basic, intermediate, and advanced
Sebastian: JSON schema is only one part of the validation process
… additional terms that are in JSON schema are allowed
McCool: we could use a stricter schema and require extensions to have a context file with a defined prefix
… also require assertions to drive validation
… there is JSON schema, then a script, then shape validation
Ege: the assertions are only checked for being present
McCool: we can annotate the implementation report with which assertions are checked
Zoltan: can we continue this discussion later?
… we need to know how this works for security
Zoltan: we need to discuss how are exposed things processed
… the exposed thing constructor is used to create a JSON object, then handlers are bound
… then expose() is called, then a TD is retrieved
… a big change was made, exposed thing doesn't inherit from consumed thing anymore
… exposed thing can be consumed using the consumed thing interface
… need review and input from the TD task force
… created an issue about how to create a TD for exposed thing
ConsumedThing
Cristiano: the consequence is that you need to use the consumed thing interface to invoke actions in an exposed thing
… thinking about bypassing the network interface when consuming things locally
Zoltan: for example using a local handler
Daniel: another reason for splitting this is that consumed thing properties are read-only
… and exposed thing properties may be updated
Cristiano: we don't have default handlers, now we need to set these handlers
Zoltan: it makes it harder to write identity functions
Zoltan: discussion about initialization of exposed thing
Cristiano: we use a dictionary as a partial TD and expand into a TD
… then validate and add required elements that are missing
… for example forms and security schemas
… there should be a default form generation
Cristiano: there are issues about how to handle title processing, whether it should be required
… partial TDs don't have mandatory elements
Sebastian: there is a question in the profile discussion about mandatory fields
McCool: title is more important than description because it's used in UIs
… also how do we handle language localization?
Discovery
Zoltan: discussion of discovery
… discovery is optional to implement; node-wot is planning to implement it
… there is a common pattern emerging for discovery
… there is a request with filters and a wait for replies
… the API provides control to start and stop discovery and uses a cursor pattern to handle multiple outputs
… next step is API alignment and implement phase 1
… considering using the fetch API and readable stream data type
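The start/stop-plus-cursor pattern could be modeled along these lines (all names are illustrative, not the proposed API):

```typescript
// Illustrative sketch of the emerging discovery pattern: start() begins
// producing results, stop() ends discovery, and a cursor-style next()
// hands out one result at a time to handle multiple outputs.
class ThingDiscovery {
  private queue: object[] = [];
  private active = false;

  start(results: object[]): void {
    this.active = true;
    this.queue.push(...results); // stand-in for replies arriving over the network
  }

  stop(): void {
    this.active = false;
  }

  // cursor pattern: returns the next discovered TD, or undefined when empty
  next(): object | undefined {
    return this.queue.shift();
  }
}

const discovery = new ThingDiscovery();
discovery.start([{ title: "Lamp" }, { title: "Sensor" }]);
let td: object | undefined;
while ((td = discovery.next()) !== undefined) {
  console.log((td as any).title);
}
discovery.stop();
```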
McCool: we should consider 2 discovery methods: automatic and direct
script management
Zoltan: runtime provisioning
node-wot update
Daniel: available releases are all marked v0.7
… working on a new branch with the new API
… there is a running instance for testing
… currently node-wot doesn't support discovery
… you need to provide a TD or location for the TD to be fetched from
… there is a management API for starting and stopping services
… management uses a TD
McCool: suggest setting up a meeting with security to discuss permissions
<kaz> (+1 to the joint discussion)
Daniel: node-wot has current support for many protocols and media types, some partially supported
… NPM statistics show a lot of interest and many downloads
Zoltan: there are peaks of activity around TPACs
Zoltan: any questions, discussion?
McCool: schedule the joint call for the week after break, week of the 12th
Zoltan: we could use the security call
McCool: short break and return for the plugfest discussion
<kaz> [5-min break]
plugfest
<kaz> Plugfest report slides
McCool: I'll go quickly through the results of the past plugfest. If you have more information, please head to the issue tracker.
… we had a lot of different projects
McCool goes through the list of the projects
W-ADE and A-MaGe
McCool: W-ADE and A-MaGe are from Ege
… W-ADE is an IDE for TDs where you can interact with your TDs
… you can also expose virtual Things
… A-MaGe is about mashup applications
… we tested those tools with CoAP thanks to Philipp Blum
… we discovered that we can fetch TDs using CoAP
… we might want to add this to the discovery document
… semantic annotations are good to help the interaction
Testbench
McCool: then we have Testbench
… it's an evaluation tool to test online web things
… it was tested with different implementations
Ege: we pointed out some possible problems in the slides
OneDM
McCool: tool for generating TM/TD from oneDM file
<sebastian> https://github.com/roman-kravtsov/sdf-object-converter
McCool: we have alignment for using json pointers with SDF
… also there was a discussion about value scaling
… SDF has scaleMinimum etc.
… Michael Koster also experimented with the Modbus protocol binding
Node-Red discovery
McCool: now node-red gen is able to automatically discover TDs in a TDD (Thing Description Directory)
… and generate nodes to interact with it
ediTDor
McCool: now it supports TMs and some support for SDF
… the flattening process defined before in scripting can be used for TMs
McCool: ok that's it let's go in the next session
marketing
<kaz> [slides tbd@@@]
seb is presenting
Sebastian: we have a new webpage
… I really like it
… it's a good starting point
… where people can get in touch with us and with WoT topics
Sebastian: we also have an explainer video
… is on youtube
https://www.youtube.com/watch?v=WMFXg-kni0U
Sebastian: we'll try to support multiple subtitles
… Japanese, German, French, Chinese, Italian ...
… now we already have Chinese and Italian
… it would be best to have a double review for each language
Ege: there was an issue about synchronization in the original file
… is it solved?
… also the newly added subtitles are not yet on YouTube
Sebastian: yep we have to do it manually
Kaz: let's talk about the issue during the marketing call
Sebastian: we are glad to add other languages
McCool: the current supported list is good
Sebastian: maybe we should add Spanish
agreement on publishing material
<Ege> wot-marketing issue 151 - Documenting the Twitter publication rules
Sebastian: there is some minor publishing material for which a full approval process would not be feasible
… we also need to be reactive on Twitter
… responses, retweet, likes etc.
… I'd propose a small subset of tweets that could skip the validation
Sebastian: what do you think?
McCool: how do we decide what's in the "boring" tweet set?
… probably we could authorize particular people to publish directly on Twitter
… if others need to tweet something they could refer to them
… chairs could occupy that role
Kaz: +1. There are several questions, mostly about content and procedure
… boredom or excitement should not really be the deciding factor
… I like the procedure described by McCool. It would be better to define another term instead of boring :)
McCool: we should be careful with short texts
… in meetings people can ask directly to chairs
Cristiano: what about retweets?
McCool: I think we should follow the same review process
Ege: I had a long list of questions about this
… could you please check them
McCool: let's document the process on this issue
McCool: email would be my preferred way
Ege: so this review process will apply for everything in the list?
McCool: yes
<citrullin> +1 on that. It's a lot of overhead.
Ege: with this process my contributions would be far fewer.
McCool: we could add Ege as a possible responsible
… this is something to discuss
Kaz: I know that I've been very strict, but we have to remind ourselves that we represent the whole W3C community.
Sebastian: don't we represent just WoT group?
Kaz: representing the W3C WoT WG means we represent the whole consortium
McCool: right
Dave: what about tweets in different languages?
Ege: I would not tweet in different lang
McCool: maybe we could have a dedicated account for each idiom
Daniel: I would not do that
Sebastian: also we don't have the manpower to handle all these accounts
… plus Twitter has a translate function
McCool: true, let's stick with English
Philipp: I understand the concerns
<Ege> thanks philipp! :)
Philipp: we should increase the tweet activity
McCool: we could give the authority to Ege since he is very active
Sebastian: let's sort this out today
McCool: ok first let's agree about the procedure
Sebastian: what about a warning system?
McCool: yeah, let's just have Ege be responsible
Kaz: again, please note I'm not saying we need censorship, just suggesting we review the content before publishing. We should be conscious about the content of the tweets: no misunderstandings, no spelling issues, no typos, etc.
<McCool> proposal: https://github.com/w3c/wot-marketing/issues/151#issuecomment-806784807
<citrullin> I guess that could be considered to be a more critical topic
Lagally: tweets should reflect some degree of consensus, although simply retweeting is usually not harmful as long as it does not endorse a particular company or organization.
… if there is anything controversial, it is better to ask.
McCool: does anyone object to the current process?
Resolution: https://github.com/w3c/wot-marketing/issues/151#issuecomment-806784807
Kaz: we should copy the procedure to marketing wiki
<Ege> https://github.com/w3c/wot-marketing/pull/117
Sebastian: I had other topics to discuss like liaisons and how to increase community awareness
… let's discuss in the next calls
McCool: I'll send email with the cancellations about the next couple of weeks
Kaz: all the TF leaders, please also update the main wiki page
McCool: thank you for your participation in this virtual F2F
Sebastian: also would appreciate all your contributions!
Closing Slides
<kaz> [adjourned]