Copyright © 2023 World Wide Web Consortium. W3C® liability, trademark and permissive document license rules apply.
This specification describes a Data Integrity Cryptosuite for use when generating digital signatures using the BBS signature scheme. The Signature Suite utilizes BBS signatures to provide selective disclosure and unlinkable derived proofs.
This section describes the status of this document at the time of its publication. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.
This is an experimental specification and is undergoing regular revisions. It is not fit for production deployment.
This document was published by the Verifiable Credentials Working Group as a Working Draft using the Recommendation track.
Publication as a Working Draft does not imply endorsement by W3C and its Members.
This is a draft document and may be updated, replaced or obsoleted by other documents at any time. It is inappropriate to cite this document as other than work in progress.
This document was produced by a group operating under the W3C Patent Policy. W3C maintains a public list of any patent disclosures made in connection with the deliverables of the group; that page also includes instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.
This document is governed by the 12 June 2023 W3C Process Document.
This specification defines a cryptographic suite for the purpose of
creating, verifying, and deriving proofs using the BBS Signature Scheme in
conformance with the Data Integrity [VC-DATA-INTEGRITY] specification. The
BBS signature scheme directly provides for selective disclosure and unlinkable
proofs. It provides four high-level functions that work within the issuer,
holder, verifier model. Specifically, an issuer uses the BBS Sign
function to
create a cryptographic value known as a "BBS signature" which is used in signing
the original credential. A holder, on receipt of
a credential signed with BBS, then verifies the credential with the BBS Verify
function.
The holder then chooses information to selectively disclose from the
received credential and uses the BBS ProofGen
function to generate a
cryptographic value, known as a "BBS proof", which is used in creating a proof
for this "derived credential". The cryptographic "BBS proof" value is not linkable
to the original "BBS signature" and a different, unlinkable "BBS proof" can be
generated by the holder for additional "derived credentials", including any
containing the exact same information.
Finally, a verifier uses the BBS ProofVerify
function to verify the derived
credential received from the holder.
Applying the BBS signature scheme to verifiable credentials involves the
processing specified in this document.
In general the suite uses the RDF Dataset Normalization Algorithm
[RDF-DATASET-NORMALIZATION] to transform an input document into its canonical
form. An issuer then uses selective disclosure primitives to separate the
canonical form into mandatory and non-mandatory statements. These are processed
separately with other information to serve as the inputs to the BBS Sign
function along with appropriate key material. This output is used to
generate a secured credential. A holder uses a set of selective disclosure
functions and the BBS Verify
function on receipt of the credential
to ascertain validity.
Similarly, on receipt of a BBS signed credential, a holder uses the RDF Dataset
Normalization Algorithm [RDF-DATASET-NORMALIZATION] to transform an input
document into its canonical form, and then applies selective disclosure
primitives to separate the canonical form into mandatory and selectively
disclosed statements, which are appropriately processed and serve as inputs to
the BBS ProofGen
function. Suitably processed, the output of this function
becomes the signed selectively disclosed credential sent to a verifier. Using
canonicalization and selective disclosure primitives, the verifier can then use
the BBS ProofVerify
function to validate the credential.
This section defines the terms used in this specification. A link to these terms is included whenever they appear in this specification.
domain
A string value that specifies the operational domain of a digital proof. This could be an Internet domain name like example.com, an ad-hoc value such as mycorp-level3-access, or a very specific transaction value like 8zF6T8J34qP3mqP. A signer could include a domain in its digital proof to restrict its use to a particular target, identified by the specified domain.
subject
The entity identified by the id property in a controller document. Anything can be a subject: person, group, organization, physical thing, digital thing, logical thing, etc.
verification method
A set of parameters that can be used together with a process to independently verify a proof. For example, a cryptographic public key can be used as a verification method with respect to a digital signature; in such usage, it verifies that the signer possessed the associated cryptographic private key.
"Verification" and "proof" in this definition are intended to apply broadly. For example, a cryptographic public key might be used during Diffie-Hellman key exchange to negotiate a shared symmetric key for encryption. This guarantees the integrity of the key agreement process. It is thus another type of verification method, even though descriptions of the process might not use the words "verification" or "proof."
As well as sections marked as non-normative, all authoring guidelines, diagrams, examples, and notes in this specification are non-normative. Everything else in this specification is normative.
The key words MAY, MUST, MUST NOT, and SHOULD in this document are to be interpreted as described in BCP 14 [RFC2119] [RFC8174] when, and only when, they appear in all capitals, as shown here.
A conforming proof is any concrete expression of the data model that complies with the normative statements in this specification. Specifically, all relevant normative statements in Sections 2. Data Model and 3. Algorithms of this document MUST be enforced.
A conforming processor is any algorithm realized as software and/or hardware that generates or consumes a conforming proof. Conforming processors MUST produce errors when non-conforming documents are consumed.
This document contains examples of JSON and JSON-LD data. Some of these examples are invalid JSON, as they include features such as inline comments (//) explaining certain portions and ellipses (...) indicating the omission of information that is irrelevant to the example. Such parts need to be removed if implementers want to treat the examples as valid JSON or JSON-LD.
The following sections outline the data model that is used by this specification for verification methods and data integrity proof formats.
These verification methods are used to verify Data Integrity Proofs [VC-DATA-INTEGRITY] produced using BLS12-381 cryptographic key material that is compliant with [CFRG-BBS-SIGNATURE]. The encoding formats for these key types are provided in this section. Lossless cryptographic key transformation processes that result in equivalent cryptographic key material MAY be used during the processing of digital signatures.
The Multikey format, as defined in [VC-DATA-INTEGRITY], is used to express public keys for the cryptographic suites defined in this specification.
The publicKeyMultibase property represents a Multibase-encoded Multikey expression of a BLS12-381 public key in the G2 group. The encoding of this field is the two-byte prefix 0xeb01 followed by the 96-byte compressed public key data. The resulting 98-byte value is then encoded using base58-btc, with z as the multibase prefix. Any other encodings MUST NOT be allowed.
Developers are advised to not accidentally publish a representation of a private key. Implementations of this specification will raise errors in the event of a [MULTICODEC] value other than 0xeb01 being used in a publicKeyMultibase value.
{
"id": "https://example.com/issuer/123#key-0",
"type": "Multikey",
"controller": "https://example.com/issuer/123",
"publicKeyMultibase": "zUC7EK3ZakmukHhuncwkbySmomv3FmrkmS36E4Ks5rsb6VQSRpoCrx6
Hb8e2Nk6UvJFSdyw9NK1scFXJp21gNNYFjVWNgaqyGnkyhtagagCpQb5B7tagJu3HDbjQ8h
5ypoHjwBb"
}
{
"@context": [
"https://www.w3.org/ns/did/v1",
"https://w3id.org/security/data-integrity/v1"
],
"id": "https://example.com/issuer/123",
"verificationMethod": [{
"id": "https://example.com/issuer/123#key-1",
"type": "Multikey",
"controller": "https://example.com/issuer/123",
"publicKeyMultibase": "zUC7EK3ZakmukHhuncwkbySmomv3FmrkmS36E4Ks5rsb6VQSRpoCr
x6Hb8e2Nk6UvJFSdyw9NK1scFXJp21gNNYFjVWNgaqyGnkyhtagagCpQb5B7tagJu3HDbjQ8h
5ypoHjwBb"
}]
}
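As a non-normative illustration of the encoding rules above, the following TypeScript sketch decodes a publicKeyMultibase value and checks the multibase prefix, the 0xeb01 multicodec header, and the 96-byte compressed G2 key length. The bs58 dependency and the function name are illustrative assumptions, not part of this specification.

// Sketch only: assumes the npm package "bs58" for base58-btc decoding.
import bs58 from 'bs58';

// Decode a bbs-2023 Multikey publicKeyMultibase value and return the raw
// 96-byte compressed BLS12-381 G2 public key, throwing on any violation.
function decodeBls12381G2PublicKey(publicKeyMultibase: string): Uint8Array {
  // The multibase prefix "z" indicates base58-btc encoding.
  if (!publicKeyMultibase.startsWith('z')) {
    throw new Error('Expected base58-btc multibase prefix "z".');
  }
  const bytes = bs58.decode(publicKeyMultibase.slice(1));
  // The first two bytes are the multicodec header 0xeb 0x01 (BLS12-381 G2 public key).
  if (bytes.length !== 98 || bytes[0] !== 0xeb || bytes[1] !== 0x01) {
    throw new Error('Expected 0xeb01 multicodec header followed by 96 key bytes.');
  }
  return bytes.slice(2);
}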
This suite relies on detached digital signatures represented using [MULTIBASE] and [MULTICODEC].
The verificationMethod property of the proof MUST be a URL. Dereferencing the verificationMethod MUST result in an object containing a type property with the value set to Multikey.
The type property of the proof MUST be DataIntegrityProof.
The cryptosuite property of the proof MUST be bbs-2023.
The created property of the proof MUST be an [XMLSCHEMA11-2] formatted date string.
The proofPurpose property of the proof MUST be a string, and MUST match the verification relationship expressed by the verification method controller.
The value of the proofValue property of the proof MUST be a BBS signature or BBS proof produced according to [CFRG-BBS-SIGNATURE], then serialized and encoded according to the procedures in Section 3. Algorithms.
The following algorithms describe how to use verifiable credentials with the BBS Signature Scheme [CFRG-BBS-SIGNATURE]. When using the BBS signature scheme, the SHAKE-256 variant SHOULD be used.
Implementations SHOULD fetch and cache verification method information as early as possible when adding or verifying proofs. Parameters passed to functions in this section use information from the verification method — such as the public key size — to determine function parameters — such as the cryptographic hashing algorithm.
When the RDF Dataset Canonicalization Algorithm [RDF-CANON] is used, implementations of that algorithm will detect dataset poisoning by default, and abort processing upon detection.
The following algorithm creates a label map factory function that uses an HMAC to shuffle canonical blank node identifiers. The required input is an HMAC (previously initialized with a secret key), HMAC. A function, labelMapFactoryFunction, is produced as output.
Within labelMapFactoryFunction, after each canonical blank node identifier has been mapped to its HMAC digest, derive the shuffled mapping from the resulting bnodeIdMap as follows:
Set hmacIds to be the sorted array of values from the bnodeIdMap, and set bnodeKeys to be the ordered array of keys from the bnodeIdMap.
For each key in bnodeKeys, replace the bnodeIdMap value for that key with the index position of the value in the hmacIds array prefixed by "b", i.e., bnodeIdMap.set(bkey, 'b' + hmacIds.indexOf(bnodeIdMap.get(bkey))).
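A minimal, non-normative TypeScript sketch of the shuffling step above is shown below. It assumes the HMAC'd identifiers have already been computed and placed in bnodeIdMap; the function name and the placeholder digest strings are illustrative.

// Sketch only: derive the shuffled "b<index>" labels from a map whose values
// are already the (encoded) HMAC digests of the canonical blank node identifiers.
function shuffleLabelMap(bnodeIdMap: Map<string, string>): Map<string, string> {
  // hmacIds: sorted array of values; bnodeKeys: keys in their existing order.
  const hmacIds = [...bnodeIdMap.values()].sort();
  const bnodeKeys = [...bnodeIdMap.keys()];
  for (const bkey of bnodeKeys) {
    // Replace each value with "b" + its index position in the sorted array.
    bnodeIdMap.set(bkey, 'b' + hmacIds.indexOf(bnodeIdMap.get(bkey)!));
  }
  return bnodeIdMap;
}

// Example: canonical identifiers mapped to stand-in HMAC digest strings.
const example = new Map([
  ['c14n0', 'uZDdd...'],
  ['c14n1', 'uAAb1...'],
  ['c14n2', 'uQx9c...'],
]);
// After shuffling, each value becomes "b0", "b1", or "b2" by digest sort order.
console.log(shuffleLabelMap(example));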
The following algorithm serializes the base proof value, including the BBS signature, HMAC key, and mandatory pointers. The required inputs are a base signature bbsSignature, an HMAC key hmacKey, and an array of mandatoryPointers. A single base proof string value is produced as output.
Initialize a byte array, proofValue, that starts with the BBS base proof header bytes 0xd9, 0x5d, and 0x02.
Initialize components to an array with three elements containing the values of bbsSignature, hmacKey, and mandatoryPointers.
CBOR-encode components and append it to proofValue.
Initialize baseProof to a string with the multibase-base64url-no-pad-encoding of proofValue. That is, return a string starting with "u" and ending with the base64url-no-pad-encoded value of proofValue.
Return baseProof as base proof.
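The following non-normative TypeScript sketch illustrates the serialization above. The cbor encode call refers to the npm "cbor" package and is an assumed dependency; any CBOR encoder could be substituted.

// Sketch only: assumes the npm package "cbor" for CBOR encoding.
import { encode } from 'cbor';

const BBS_BASE_PROOF_HEADER = Uint8Array.from([0xd9, 0x5d, 0x02]);

function serializeBaseProofValue(
  bbsSignature: Uint8Array,
  hmacKey: Uint8Array,
  mandatoryPointers: string[]
): string {
  // CBOR-encode the three components as a single array.
  const components = encode([bbsSignature, hmacKey, mandatoryPointers]);
  // Prepend the BBS base proof header bytes 0xd9 0x5d 0x02.
  const proofValue = Buffer.concat([BBS_BASE_PROOF_HEADER, components]);
  // Multibase base64url-no-pad encoding: "u" prefix plus the encoded bytes.
  return 'u' + proofValue.toString('base64url');
}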
The following algorithm parses the components of a bbs-2023
selective
disclosure base proof value. The required input is a proof value
(proofValue). A single object, parsed base proof, containing
three elements, using the names "bbsSignature", "hmacKey",
and "mandatoryPointers", is produced as output.
Ensure the proofValue string starts with u, indicating that it is a multibase-base64url-no-pad-encoded value, and throw an error if it does not.
Initialize decodedProofValue to the result of base64url-no-pad-decoding the substring following the leading u in proofValue.
Ensure that the decodedProofValue starts with the BBS base proof header bytes 0xd9, 0x5d, and 0x02, and throw an error if it does not.
Initialize components to an array that is the result of CBOR-decoding the bytes that follow the three-byte BBS base proof header. Ensure the result is an array of three elements.
Return an object with properties set to the three elements, using the names "bbsSignature", "hmacKey", and "mandatoryPointers", respectively.
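A matching, non-normative decoding sketch is shown below; the cbor package's decodeFirstSync call is an assumed dependency.

// Sketch only: assumes the npm package "cbor" for CBOR decoding.
import { decodeFirstSync } from 'cbor';

function parseBaseProofValue(proofValue: string): {
  bbsSignature: Uint8Array;
  hmacKey: Uint8Array;
  mandatoryPointers: string[];
} {
  if (!proofValue.startsWith('u')) {
    throw new Error('Expected multibase base64url-no-pad prefix "u".');
  }
  const decoded = Buffer.from(proofValue.slice(1), 'base64url');
  // Check the BBS base proof header bytes 0xd9 0x5d 0x02.
  if (decoded[0] !== 0xd9 || decoded[1] !== 0x5d || decoded[2] !== 0x02) {
    throw new Error('Unexpected BBS base proof header.');
  }
  // CBOR-decode the remaining bytes into the three components.
  const components = decodeFirstSync(decoded.subarray(3));
  if (!Array.isArray(components) || components.length !== 3) {
    throw new Error('Expected an array of three elements.');
  }
  const [bbsSignature, hmacKey, mandatoryPointers] = components;
  return { bbsSignature, hmacKey, mandatoryPointers };
}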
The following algorithm creates data to be used to generate a derived proof. The inputs include a JSON-LD document (document), a BBS base proof (proof), an array of JSON pointers to use to selectively disclose statements (selectivePointers), and any custom JSON-LD API options (such as a document loader). A single object, disclosure data, is produced as output, which contains the "bbsProof", "labelMap", "mandatoryIndexes", "selectiveIndexes", and "revealDocument" fields.
Initialize bbsSignature, hmacKey, and mandatoryPointers to the values of the associated properties in the object returned when calling the algorithm in Section 3.2.2 parseBaseProofValue, passing the proofValue from proof.
Initialize hmac to an HMAC API using hmacKey. The HMAC uses the same hash algorithm used in the signature algorithm, i.e., SHAKE-256.
Initialize labelMapFactoryFunction to the result of calling the createShuffledIdLabelMapFunction algorithm, passing hmac as HMAC.
Initialize combinedPointers to the concatenation of mandatoryPointers and selectivePointers.
Initialize groupDefinitions to a map with the following entries: key of the string "mandatory" and value of mandatoryPointers; key of the string "selective" and value of selectivePointers; and key of the string "combined" and value of combinedPointers.
Initialize groups and labelMap to the result of calling the algorithm in Section 3.3.16 canonicalizeAndGroup of the [DI-ECDSA] specification, passing document, labelMapFactoryFunction, groupDefinitions, and any custom JSON-LD API options. Note: This step transforms the document into an array of canonical N-Quads whose order has been shuffled based on 'hmac' applied blank node identifiers, and groups the N-Quad strings according to selections based on JSON pointers.
Initialize mandatoryIndexes to an empty array. Set mandatoryMatch to the groups.mandatory.matching map; set combinedMatch to groups.combined.matching; and set combinedIndexes to the ordered array of just the keys of the combinedMatch map.
For each key in the mandatoryMatch map, find its index in the combinedIndexes array (e.g., combinedIndexes.indexOf(key)), and add this value to the mandatoryIndexes array.
Initialize selectiveIndexes to an empty array. Set selectiveMatch to the groups.selective.matching map; set mandatoryNonMatch to the map groups.mandatory.nonMatching; and set nonMandatoryIndexes to the ordered array of just the keys of the mandatoryNonMatch map.
For each key in the selectiveMatch map, find its index in the nonMandatoryIndexes array (e.g., nonMandatoryIndexes.indexOf(key)), and add this value to the selectiveIndexes array.
Initialize bbsMessages to an array of byte arrays obtained from the UTF-8 encoding of the values in the nonMandatory array.
Compute the bbsHeader using the following steps:
Initialize proofHash to the result of calling the RDF Dataset Canonicalization algorithm [RDF-CANON] on proof with the proofValue removed and then cryptographically hashing the result using the same hash that is used by the signature algorithm, i.e., SHAKE-256. Note: This step can be performed in parallel; it only needs to be completed before this algorithm terminates, as the result is part of the return value.
Initialize mandatoryHash to the result of calling the algorithm in Section 3.3.17 hashMandatoryNQuads of the [DI-ECDSA] specification, passing the values from the map groups.mandatory.matching and utilizing the SHAKE-256 algorithm.
Set bbsHeader to the concatenation of proofHash and mandatoryHash in that order.
Set bbsProof to the value computed by the ProofGen procedure from [CFRG-BBS-SIGNATURE], i.e., ProofGen(PK, signature, header, ph, messages, disclosed_indexes), where PK is the original issuer's public key, signature is the bbsSignature, header is the bbsHeader, ph is an empty byte array, messages is bbsMessages, and disclosed_indexes is selectiveIndexes.
Initialize revealDocument to the result of the selectJsonLd algorithm of [DI-ECDSA], passing document and combinedPointers as pointers.
Determine the canonical blank node identifier map, canonicalIdMap, that a verifier will produce when canonicalizing revealDocument, by running the RDF Dataset Canonicalization Algorithm [RDF-CANON] on the selected N-Quads. Initialize verifierLabelMap to an empty map. For each key (inputLabel) and value (verifierLabel) in canonicalIdMap:
Add an entry to verifierLabelMap, using verifierLabel as the key, and the value associated with inputLabel as a key in labelMap as the value.
Return an object with properties matching bbsProof, "verifierLabelMap" for labelMap, mandatoryIndexes, selectiveIndexes, and revealDocument.
The following algorithm compresses a label map. The required input is label map (labelMap). The output is a compressed label map.
Initialize map to an empty map.
For each entry (k, v) in labelMap:
Add an entry to map, with a key that is a base-10 integer parsed from the characters following the "c14n" prefix in k, and a value that is a base-10 integer parsed from the characters following the "b" prefix in v.
Return map as compressed label map.
The following algorithm decompresses a label map. The required input is a compressed label map (compressedLabelMap). The output is a decompressed label map.
Initialize map to an empty map.
For each entry (k, v) in compressedLabelMap:
Add an entry to map, with a key that adds the prefix "c14n" to k, and a value that adds a prefix of "b" to v.
Return map as decompressed label map.
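A brief, non-normative sketch of both the compression and decompression mappings above follows; the function names are illustrative.

// Sketch only: convert between "c14n<n>" → "b<m>" label maps and integer maps.
function compressLabelMap(labelMap: Map<string, string>): Map<number, number> {
  const map = new Map<number, number>();
  for (const [k, v] of labelMap) {
    // Parse the integers following the "c14n" and "b" prefixes.
    map.set(parseInt(k.slice(4), 10), parseInt(v.slice(1), 10));
  }
  return map;
}

function decompressLabelMap(compressedLabelMap: Map<number, number>): Map<string, string> {
  const map = new Map<string, string>();
  for (const [k, v] of compressedLabelMap) {
    // Restore the "c14n" and "b" prefixes.
    map.set(`c14n${k}`, `b${v}`);
  }
  return map;
}

// Example from the test vectors in Section A: c14n0→b2, c14n1→b4, c14n2→b7.
const labelMap = new Map([['c14n0', 'b2'], ['c14n1', 'b4'], ['c14n2', 'b7']]);
console.log(compressLabelMap(labelMap)); // Map { 0 => 2, 1 => 4, 2 => 7 }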
The following algorithm serializes a derived proof value. The required inputs are a BBS proof (bbsProof), a label map (labelMap), an array of mandatory indexes (mandatoryIndexes), and an array of selective indexes (selectiveIndexes). A single derived proof value, serialized as a byte string, is produced as output.
Initialize compressedLabelMap to the result of calling the algorithm in Section 3.2.4 compressLabelMap, passing labelMap as the parameter.
Initialize a byte array, proofValue, that starts with the BBS disclosure proof header bytes 0xd9, 0x5d, and 0x03.
Initialize components to an array with four elements containing the values of bbsProof, compressedLabelMap, mandatoryIndexes, and selectiveIndexes.
CBOR-encode components and append it to proofValue.
Return the derived proof as a string with the multibase-base64url-no-pad-encoding of proofValue. That is, return a string starting with "u" and ending with the base64url-no-pad-encoded value of proofValue.
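Analogously to the base proof serialization, the following non-normative sketch (again assuming the npm cbor package) illustrates derived proof serialization with the 0xd9 0x5d 0x03 header.

// Sketch only: assumes the npm package "cbor" for CBOR encoding.
import { encode } from 'cbor';

const BBS_DISCLOSURE_PROOF_HEADER = Uint8Array.from([0xd9, 0x5d, 0x03]);

function serializeDerivedProofValue(
  bbsProof: Uint8Array,
  labelMap: Map<string, string>,
  mandatoryIndexes: number[],
  selectiveIndexes: number[]
): string {
  // Compress "c14n<n>" → "b<m>" entries into an integer-to-integer map.
  const compressedLabelMap = new Map<number, number>();
  for (const [k, v] of labelMap) {
    compressedLabelMap.set(parseInt(k.slice(4), 10), parseInt(v.slice(1), 10));
  }
  // CBOR-encode the four components and prepend the 0xd9 0x5d 0x03 header.
  const components = encode([bbsProof, compressedLabelMap, mandatoryIndexes, selectiveIndexes]);
  const proofValue = Buffer.concat([BBS_DISCLOSURE_PROOF_HEADER, components]);
  // Multibase base64url-no-pad encoding with the "u" prefix.
  return 'u' + proofValue.toString('base64url');
}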
The following algorithm parses the components of the derived proof value. The required input is a derived proof value (proofValue). A single derived proof value object is produced as output, which contains a set of four elements, using the names "bbsProof", "labelMap", "mandatoryIndexes", and "selectiveIndexes".
Ensure the proofValue string starts with u, indicating that it is a multibase-base64url-no-pad-encoded value, and throw an error if it does not.
Initialize decodedProofValue to the result of base64url-no-pad-decoding the substring that follows the leading u in proofValue.
Ensure that the decodedProofValue starts with the BBS disclosure proof header bytes 0xd9, 0x5d, and 0x03, and throw an error if it does not.
Initialize components to an array that is the result of CBOR-decoding the bytes that follow the three-byte BBS disclosure proof header. Ensure the result is an array of four elements: a byte array, a map of integers to integers, an array of integers, and another array of integers; otherwise, throw an error.
Replace the second element in components using the result of calling the algorithm in Section 3.2.5 decompressLabelMap, passing the existing second element of components as compressedLabelMap.
Return derived proof value as an object with properties set to the four elements, using the names "bbsProof", "labelMap", "mandatoryIndexes", and "selectiveIndexes", respectively.
The following algorithm creates the data needed to perform verification of a
BBS-protected verifiable credential. The inputs include a JSON-LD
document (document), a BBS disclosure proof (proof),
and any custom JSON-LD API options (such as a document loader). A single
verify data object value is produced as output containing the following
fields: "bbsProof
", "proofHash
", "mandatoryHash
", "selectedIndexes
", and
"nonMandatory
".
Initialize proofHash to the result of performing RDF Dataset Canonicalization [RDF-CANON] on the proof options, i.e., the proof portion of the document with the proofValue removed, and then cryptographically hashing the result. The hash used is the same as that used in the signature algorithm, i.e., SHAKE-256. Note: This step can be performed in parallel; it only needs to be completed before this algorithm needs to use the proofHash value.
Initialize bbsProof, labelMap, mandatoryIndexes, and selectiveIndexes to the values associated with their property names in the object returned when calling the algorithm in Section 3.2.7 parseDerivedProofValue, passing proofValue from proof.
Initialize labelMapFactoryFunction to the result of calling the "createLabelMapFunction" algorithm, passing labelMap.
Initialize nquads to the result of calling the "labelReplacementCanonicalize" algorithm of [DI-ECDSA], passing document, labelMapFactoryFunction, and any custom JSON-LD API options. Note: This step transforms the document into an array of canonical N-Quads with pseudorandom blank node identifiers based on labelMap.
Initialize mandatory to an empty array.
Initialize nonMandatory to an empty array.
For each entry (index, nq) in nquads, separate the N-Quads into mandatory and non-mandatory categories:
If mandatoryIndexes includes index, add nq to mandatory.
Otherwise, add nq to nonMandatory.
Initialize mandatoryHash to the result of calling the "hashMandatory" primitive, passing mandatory.
Return an object with properties matching bbsProof, proofHash, nonMandatory, mandatoryHash, and selectiveIndexes.
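The index-based separation described above can be sketched as follows; this is illustrative only and the function name is an assumption.

// Sketch only: split canonical N-Quads into mandatory and non-mandatory groups
// using the mandatoryIndexes recovered from the derived proof value.
function separateNQuads(
  nquads: string[],
  mandatoryIndexes: number[]
): { mandatory: string[]; nonMandatory: string[] } {
  const mandatory: string[] = [];
  const nonMandatory: string[] = [];
  nquads.forEach((nq, index) => {
    // N-Quads at mandatory indexes are aggregated and hashed; the rest become
    // the disclosed BBS messages.
    (mandatoryIndexes.includes(index) ? mandatory : nonMandatory).push(nq);
  });
  return { mandatory, nonMandatory };
}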
The bbs-2023
cryptographic suite takes an input document, canonicalizes
the document using the Universal RDF Dataset Canonicalization Algorithm
[RDF-CANON], and then applies a number of transformations and cryptographic
operations resulting in the production of a data integrity proof. The algorithms
in this section also include the verification of such a data integrity proof.
To generate a base proof, the algorithm in Section 4.1: Add Proof of the Data Integrity [VC-DATA-INTEGRITY] specification MUST be executed. For that algorithm, the cryptographic suite specific transformation algorithm is defined in Section 3.3.2 Base Proof Transformation (bbs-2023), the hashing algorithm is defined in Section 3.3.3 Base Proof Hashing (bbs-2023), and the proof serialization algorithm is defined in Section 3.3.5 Base Proof Serialization (bbs-2023).
The following algorithm specifies how to transform an unsecured input document into a transformed document that is ready to be provided as input to the hashing algorithm in Section 3.3.3 Base Proof Hashing (bbs-2023).
Required inputs to this algorithm are an unsecured data document (unsecuredDocument) and transformation options (options). The transformation options MUST contain a type identifier for the cryptographic suite (type), a cryptosuite identifier (cryptosuite), and a verification method (verificationMethod). The transformation options MUST contain an array of mandatory JSON pointers (mandatoryPointers) and MAY contain additional options, such as a JSON-LD document loader. A transformed data document is produced as output. Whenever this algorithm encodes strings, it MUST use UTF-8 encoding.
Initialize hmac to an HMAC API using a locally generated and exportable HMAC key. The HMAC uses the same hash algorithm used in the signature algorithm, i.e., SHAKE-256.
Initialize labelMapFactoryFunction to the result of calling the createShuffledIdLabelMapFunction algorithm, passing hmac as HMAC.
Initialize groupDefinitions to a map with an entry with a key of the string "mandatory" and a value of mandatoryPointers.
Initialize groups to the result of calling the algorithm in Section 3.3.16 canonicalizeAndGroup of the [DI-ECDSA] specification, passing labelMapFactoryFunction, groupDefinitions, unsecuredDocument as document, and any custom JSON-LD API options. Note: This step transforms the document into an array of canonical N-Quads whose order has been shuffled based on 'hmac' applied blank node identifiers, and groups the N-Quad strings according to selections based on JSON pointers.
Initialize mandatory to the values in the groups.mandatory.matching map.
Initialize nonMandatory to the values in the groups.mandatory.nonMatching map.
Initialize hmacKey to the result of exporting the HMAC key from hmac.
Return an object with "mandatoryPointers" set to mandatoryPointers, "mandatory" set to mandatory, "nonMandatory" set to nonMandatory, and "hmacKey" set to hmacKey.
The following algorithm specifies how to cryptographically hash a transformed data document and proof configuration into cryptographic hash data that is ready to be provided as input to the algorithms in Section 3.3.5 Base Proof Serialization (bbs-2023).
The required inputs to this algorithm are a transformed data document (transformedDocument) and canonical proof configuration (canonicalProofConfig). A hash data value represented as an object is produced as output.
Initialize proofHash to the result of calling the RDF Dataset Canonicalization algorithm [RDF-CANON] on canonicalProofConfig and then cryptographically hashing the result using the same hash that is used by the signature algorithm, i.e., SHAKE-256. Note: This step can be performed in parallel; it only needs to be completed before this algorithm terminates, as the result is part of the return value.
Initialize mandatoryHash to the result of calling the algorithm in Section 3.3.17 hashMandatoryNQuads of the [DI-ECDSA] specification, passing transformedDocument.mandatory and utilizing the SHAKE-256 algorithm.
Initialize hashData as a deep copy of transformedDocument, and add proofHash as "proofHash" and mandatoryHash as "mandatoryHash" to that object.
Return hashData as hash data.
The following algorithm specifies how to generate a proof configuration from a set of proof options that is used as input to the base proof hashing algorithm.
The required inputs to this algorithm are proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MUST contain a cryptosuite identifier (cryptosuite). A proof configuration object is produced as output.
If proofConfig.type is not set to DataIntegrityProof and proofConfig.cryptosuite is not set to bbs-2023, an INVALID_PROOF_CONFIGURATION error MUST be raised.
If proofConfig.created is set and is not a valid [XMLSCHEMA11-2] datetime, an INVALID_PROOF_DATETIME error MUST be raised.
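A hedged, non-normative sketch of the two checks above is shown below. The error names are taken from the text; the surrounding function shape, and the use of Date.parse as an approximation of full [XMLSCHEMA11-2] dateTime validation, are assumptions.

// Sketch only: validate the proof configuration options described above.
interface ProofConfig {
  type: string;
  cryptosuite?: string;
  created?: string;
  [key: string]: unknown;
}

function validateProofConfig(proofConfig: ProofConfig): void {
  // Both values are required for this cryptosuite.
  if (proofConfig.type !== 'DataIntegrityProof' || proofConfig.cryptosuite !== 'bbs-2023') {
    throw new Error('INVALID_PROOF_CONFIGURATION');
  }
  // created, when present, must be a valid dateTime string; Date.parse is only
  // an approximation of [XMLSCHEMA11-2] validation.
  if (proofConfig.created !== undefined && Number.isNaN(Date.parse(proofConfig.created))) {
    throw new Error('INVALID_PROOF_DATETIME');
  }
}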
The following algorithm, to be called by an issuer of a BBS-protected Verifiable Credential, specifies how to create a base proof. The base proof is to be given only to the holder, who is responsible for generating a derived proof from it, exposing only selectively disclosed details in the proof to a verifier. This algorithm is designed to be used in conjunction with the algorithms defined in the Data Integrity [VC-DATA-INTEGRITY] specification, Section 4: Algorithms. Required inputs are cryptographic hash data (hashData) and proof options (options). The proof options MUST contain a type identifier for the cryptographic suite (type) and MAY contain a cryptosuite identifier (cryptosuite). A single digital proof value represented as a series of bytes is produced as output.
Initialize proofHash, mandatoryPointers, mandatoryHash, nonMandatory, and hmacKey to the values associated with their property names in hashData.
Initialize bbsHeader to the concatenation of proofHash and mandatoryHash in that order.
Initialize bbsMessages to an array of byte arrays obtained from the UTF-8 encoding of the values in the nonMandatory array.
Compute the bbsSignature using the Sign procedure of [CFRG-BBS-SIGNATURE], with appropriate key material, bbsHeader for the header, and bbsMessages for the messages.
Initialize proofValue to the result of calling the algorithm in Section 3.2.1 serializeBaseProofValue, passing bbsSignature, hmacKey, and mandatoryPointers as parameters to the algorithm.
Return proofValue as digital proof.
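The following non-normative sketch shows the data flow of the steps above. The bbsSign declaration stands in for the Sign procedure of [CFRG-BBS-SIGNATURE] and is a hypothetical placeholder, as is serializeBaseProofValue (illustrated earlier); neither is an API defined by this specification.

// Sketch only: bbsSign and serializeBaseProofValue are placeholders for the
// [CFRG-BBS-SIGNATURE] Sign procedure and the Section 3.2.1 algorithm.
declare function bbsSign(args: {
  secretKey: Uint8Array;
  publicKey: Uint8Array;
  header: Uint8Array;
  messages: Uint8Array[];
}): Uint8Array;
declare function serializeBaseProofValue(
  bbsSignature: Uint8Array,
  hmacKey: Uint8Array,
  mandatoryPointers: string[]
): string;

interface HashData {
  proofHash: Uint8Array;
  mandatoryPointers: string[];
  mandatoryHash: Uint8Array;
  nonMandatory: string[];
  hmacKey: Uint8Array;
}

function createBaseProofValue(hashData: HashData, secretKey: Uint8Array, publicKey: Uint8Array): string {
  // bbsHeader is the concatenation of proofHash and mandatoryHash, in that order.
  const bbsHeader = new Uint8Array([...hashData.proofHash, ...hashData.mandatoryHash]);
  // The non-mandatory N-Quads, UTF-8 encoded, are the BBS messages.
  const bbsMessages = hashData.nonMandatory.map((nq) => new TextEncoder().encode(nq));
  const bbsSignature = bbsSign({ secretKey, publicKey, header: bbsHeader, messages: bbsMessages });
  return serializeBaseProofValue(bbsSignature, hashData.hmacKey, hashData.mandatoryPointers);
}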
The following algorithm, to be called by a holder of a bbs-2023-protected
verifiable credential, creates a selective disclosure derived proof.
The derived proof is to be given to the verifier. The inputs include a
JSON-LD document (document), a BBS base proof
(proof), an array of JSON pointers to use to selectively disclose
statements (selectivePointers), and any custom JSON-LD API options,
such as a document loader. A single selectively revealed document
value, represented as an object, is produced as output.
Initialize bbsProof, labelMap, mandatoryIndexes, selectiveIndexes, and revealDocument to the values associated with their property names in the object returned when calling the algorithm in Section 3.2.3 createDisclosureData, passing the document, proof, selectivePointers, and any custom JSON-LD API options, such as a document loader.
Initialize newProof to a shallow copy of proof.
Replace proofValue in newProof with the result of calling the algorithm in Section 3.2.6 serializeDerivedProofValue, passing bbsProof, labelMap, mandatoryIndexes, and selectiveIndexes.
Set the "proof" property in revealDocument to newProof.
Return revealDocument as the selectively revealed document.
The following algorithm attempts verification of a bbs-2023
derived
proof. This algorithm is called by a verifier of a BBS-protected
verifiable credential. The inputs include a JSON-LD document
(document), a BBS disclosure proof (proof), and any
custom JSON-LD API options (such as a document loader). A single boolean
verification result value is produced as output.
Initialize bbsProof, proofHash, mandatoryHash, selectedIndexes, and nonMandatory to the values associated with their property names in the object returned when calling the algorithm in Section 3.2.8 createVerifyData, passing the document, proof, and any custom JSON-LD API options (such as a document loader).
Initialize bbsHeader to the concatenation of proofHash and mandatoryHash in that order. Initialize disclosedMessages to an array of byte arrays obtained from the UTF-8 encoding of the elements of the nonMandatory array.
Initialize verificationResult to the result of applying the verification algorithm ProofVerify of [CFRG-BBS-SIGNATURE], with PK set as the public key of the original issuer, proof set as bbsProof, header set as bbsHeader, disclosed_messages set as disclosedMessages, ph set as an empty byte array, and disclosed_indexes set as selectiveIndexes. Return verificationResult as verification result.
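A comparable, non-normative sketch of the verification step follows, with bbsProofVerify standing in for the ProofVerify procedure of [CFRG-BBS-SIGNATURE]; the declaration and argument names are illustrative assumptions.

// Sketch only: bbsProofVerify is a placeholder for the [CFRG-BBS-SIGNATURE]
// ProofVerify procedure; its inputs come from the Section 3.2.8 algorithm.
declare function bbsProofVerify(args: {
  publicKey: Uint8Array;          // PK of the original issuer
  proof: Uint8Array;              // bbsProof
  header: Uint8Array;             // bbsHeader
  presentationHeader: Uint8Array; // ph, an empty byte array here
  disclosedMessages: Uint8Array[];
  disclosedIndexes: number[];
}): boolean;

function verifyDerivedProof(
  publicKey: Uint8Array,
  bbsProof: Uint8Array,
  proofHash: Uint8Array,
  mandatoryHash: Uint8Array,
  nonMandatory: string[],
  selectiveIndexes: number[]
): boolean {
  // bbsHeader is the concatenation of proofHash and mandatoryHash.
  const bbsHeader = new Uint8Array([...proofHash, ...mandatoryHash]);
  const disclosedMessages = nonMandatory.map((nq) => new TextEncoder().encode(nq));
  return bbsProofVerify({
    publicKey,
    proof: bbsProof,
    header: bbsHeader,
    presentationHeader: new Uint8Array(0),
    disclosedMessages,
    disclosedIndexes: selectiveIndexes,
  });
}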
TODO: We need to add a complete list of privacy considerations.
TODO: We need to add a complete list of security considerations.
This section is non-normative.
Demonstration of selective disclosure features including mandatory disclosure, selective disclosure, and overlap between those, requires an input credential document with more content than previous test vectors. To avoid excessively long test vectors, the starting document test vector is based on a purely fictitious windsurfing (sailing) competition scenario. In addition, we break the test vectors into two groups, based on those that would be generated by the issuer (base proof) and those that would be generated by the holder (derived proof).
To add a selective disclosure base proof to a document, the issuer needs the following cryptographic key material:
The key material used for generating the test vectors to test add base proof is shown below. Hexadecimal representation is used for the BBS key pairs and the HMAC key.
{ "publicKeyHex": "a4ef1afa3da575496f122b9b78b8c24761531a8a093206ae7c45b80759c168ba4f7a260f9c3367b6c019b4677841104b10665edbe70ba3ebe7d9cfbffbf71eb016f70abfbb163317f372697dc63efd21fc55764f63926a8f02eaea325a2a888f", "privateKeyHex": "66d36e118832af4c5e28b2dfe1b9577857e57b042a33e06bdea37b811ed09ee0", "hmacKeyString": "00112233445566778899AABBCCDDEEFF00112233445566778899AABBCCDDEEFF" }
In our scenario, a sailor is registering with a race organizer for a series of windsurfing races to be held over a number of days on Maui. The organizer will inspect the sailor's equipment to certify that what has been declared is accurate. The sailor's unsigned equipment inventory is shown below.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 5.5, "sailName": "Kihei", "year": 2023 }, { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7.0, "sailName": "Lahaina", "year": 2020 }, { "size": 7.8, "sailName": "Lahaina", "year": 2023 } ], "boards": [ { "boardName": "CompFoil170", "brand": "Wailea", "year": 2022 }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] } }
In addition to letting other sailors know what kinds of equipment their competitors may be sailing on, it is mandatory that each sailor disclose the year of their most recent windsurfing board and full details on two of their sails. Note that all sailors are identified by a sail number that is printed on all their equipment. This mandatory information is specified via an array of JSON pointers as shown below.
["/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2"]
The result of applying the above JSON pointers to the sailor's equipment document is shown below.
[ { "pointer": "/sailNumber", "value": "Earth101" }, { "pointer": "/sails/1", "value": { "size": 6.1, "sailName": "Lahaina", "year": 2023 } }, { "pointer": "/boards/0/year", "value": 2022 }, { "pointer": "/sails/2", "value": { "size": 7, "sailName": "Lahaina", "year": 2020 } } ]
Transformation of the unsigned document begins with canonicalizing the document, as shown below.
[ "_:c14n0 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n", "_:c14n0 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:c14n0 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n1 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:c14n1 <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:c14n1 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n2 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n", "_:c14n2 <https://www.w3.org/2018/credentials#credentialSubject> _:c14n6 .\n", "_:c14n3 <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n", "_:c14n3 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:c14n3 <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n4 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:c14n4 <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n4 <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:c14n5 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#boards> _:c14n0 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#boards> _:c14n3 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n1 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n4 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n5 .\n", "_:c14n6 <https://windsurf.grotto-networking.com/selective#sails> _:c14n7 .\n", "_:c14n7 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:c14n7 <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:c14n7 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ]
To prevent possible information leakage from the ordering of the blank node IDs, these are processed through a PRF (i.e., the HMAC) to give the canonicalized HMAC document shown below. This represents an ordered list of statements that will be subject to mandatory and selective disclosure, i.e., it is from this list that statements are grouped.
[ "_:b0 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:b0 <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:b0 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:b1 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:b1 <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:b1 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:b2 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n", "_:b2 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:b2 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:b3 <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n", "_:b3 <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n", "_:b3 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:b4 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n", "_:b4 <https://www.w3.org/2018/credentials#credentialSubject> _:b6 .\n", "_:b5 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n", "_:b5 <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:b5 <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n", "_:b6 <https://windsurf.grotto-networking.com/selective#boards> _:b2 .\n", "_:b6 <https://windsurf.grotto-networking.com/selective#boards> _:b7 .\n", "_:b6 <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n", "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b0 .\n", "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b1 .\n", "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b3 .\n", "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b5 .\n", "_:b7 <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n", "_:b7 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n", "_:b7 <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ]
The above canonical document gets grouped into mandatory and non-mandatory statements. The final output of the selective disclosure transformation process is shown below. Each statement is now grouped as mandatory or non-mandatory, and its index in the previous list of statements is remembered.
{ "mandatoryPointers": [ "/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2" ], "mandatory": { "dataType": "Map", "value": [ [ 0, "_:b0 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n" ], [ 1, "_:b0 <https://windsurf.grotto-networking.com/selective#size> \"6.1E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n" ], [ 2, "_:b0 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 8, "_:b2 <https://windsurf.grotto-networking.com/selective#year> \"2022\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 12, "_:b4 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .\n" ], [ 13, "_:b4 <https://www.w3.org/2018/credentials#credentialSubject> _:b6 .\n" ], [ 14, "_:b5 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n" ], [ 15, "_:b5 <https://windsurf.grotto-networking.com/selective#size> \"7\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 16, "_:b5 <https://windsurf.grotto-networking.com/selective#year> \"2020\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 17, "_:b6 <https://windsurf.grotto-networking.com/selective#boards> _:b2 .\n" ], [ 19, "_:b6 <https://windsurf.grotto-networking.com/selective#sailNumber> \"Earth101\" .\n" ], [ 20, "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b0 .\n" ], [ 23, "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b5 .\n" ] ] }, "nonMandatory": { "dataType": "Map", "value": [ [ 3, "_:b1 <https://windsurf.grotto-networking.com/selective#sailName> \"Lahaina\" .\n" ], [ 4, "_:b1 <https://windsurf.grotto-networking.com/selective#size> \"7.8E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n" ], [ 5, "_:b1 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 6, "_:b2 <https://windsurf.grotto-networking.com/selective#boardName> \"CompFoil170\" .\n" ], [ 7, "_:b2 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n" ], [ 9, "_:b3 <https://windsurf.grotto-networking.com/selective#sailName> \"Kihei\" .\n" ], [ 10, "_:b3 <https://windsurf.grotto-networking.com/selective#size> \"5.5E0\"^^<http://www.w3.org/2001/XMLSchema#double> .\n" ], [ 11, "_:b3 <https://windsurf.grotto-networking.com/selective#year> \"2023\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ], [ 18, "_:b6 <https://windsurf.grotto-networking.com/selective#boards> _:b7 .\n" ], [ 21, "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b1 .\n" ], [ 22, "_:b6 <https://windsurf.grotto-networking.com/selective#sails> _:b3 .\n" ], [ 24, "_:b7 <https://windsurf.grotto-networking.com/selective#boardName> \"Kanaha Custom\" .\n" ], [ 25, "_:b7 <https://windsurf.grotto-networking.com/selective#brand> \"Wailea\" .\n" ], [ 26, "_:b7 <https://windsurf.grotto-networking.com/selective#year> \"2019\"^^<http://www.w3.org/2001/XMLSchema#integer> .\n" ] ] }, "hmacKeyString": "00112233445566778899AABBCCDDEEFF00112233445566778899AABBCCDDEEFF" }
The next step is to create the base proof configuration and canonicalize it. This is shown in the following two examples.
{ "type": "DataIntegrityProof", "cryptosuite": "bbs-2023", "created": "2023-08-15T23:36:38Z", "verificationMethod": "did:key:zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ#zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ", "proofPurpose": "assertionMethod", "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ] }
_:c14n0 <http://purl.org/dc/terms/created> "2023-08-15T23:36:38Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> . _:c14n0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#DataIntegrityProof> . _:c14n0 <https://w3id.org/security#cryptosuite> "bbs-2023" . _:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> . _:c14n0 <https://w3id.org/security#verificationMethod> <did:key:zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ#zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ> .
In the hashing step, we compute the SHAKE-256 hash of the canonicalized proof
options to produce the proofHash
, and we compute the SHAKE-256 hash of the
join of all the mandatory N-Quads to produce the mandatoryHash
. These are
shown below in hexadecimal format.
{ "proofHash": "109514ed8101a836d240819e30630f48639bf7f1f247074e928eaad99e5775d4", "mandatoryHash": "e8bf46bff3db96eabc3a9410795dc94bc3537165e082f4a3e58841982fd7d4b3" }
Shown below are the computed bbsSignature
in hexadecimal, and the
mandatoryPointers
. These are are fed to the final serialization step with the
hmacKey
.
{ "bbsSignature": "93c7abe23fdf4856654bc858e607b7659af82b564340731454884724ec01e25360ac49e39cf0df7631535373042caed256abed6e81884e71a21590fef8dbe07e177dcedd8cfe94e4574c4ab51a22bdf9", "mandatoryPointers": [ "/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2" ] }
Finally, the values above are run through the algorithm of Section
3.2.1 serializeBaseProofValue, to produce the proofValue
which is
used in the signed base document shown below.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 5.5, "sailName": "Kihei", "year": 2023 }, { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7, "sailName": "Lahaina", "year": 2020 }, { "size": 7.8, "sailName": "Lahaina", "year": 2023 } ], "boards": [ { "boardName": "CompFoil170", "brand": "Wailea", "year": 2022 }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "bbs-2023", "created": "2023-08-15T23:36:38Z", "verificationMethod": "did:key:zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ#zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ", "proofPurpose": "assertionMethod", "proofValue": "u2V0Cg9hAWFCTx6viP99IVmVLyFjmB7dlmvgrVkNAcxRUiEck7AHiU2CsSeOc8N92MVNTcwQsrtJWq-1ugYhOcaIVkP742-B-F33O3Yz-lORXTEq1GiK9-dhAWCAAESIzRFVmd4iZqrvM3e7_ABEiM0RVZneImaq7zN3u_4R4HS9jcmVkZW50aWFsU3ViamVjdC9zYWlsTnVtYmVyeBovY3JlZGVudGlhbFN1YmplY3Qvc2FpbHMvMXggL2NyZWRlbnRpYWxTdWJqZWN0L2JvYXJkcy8wL3llYXJ4Gi9jcmVkZW50aWFsU3ViamVjdC9zYWlscy8y" } }
To create a derived proof, a holder starts with a signed document
containing a base proof. The base document we will use for these test vectors is
the final example from Section A.1 Base Proof, above. The first
step is to run the algorithm of Section 3.2.2 parseBaseProofValue to
recover bbsSignature
, hmacKey
, and mandatoryPointers
, as shown below.
{ "bbsSignature": "93c7abe23fdf4856654bc858e607b7659af82b564340731454884724ec01e25360ac49e39cf0df7631535373042caed256abed6e81884e71a21590fef8dbe07e177dcedd8cfe94e4574c4ab51a22bdf9", "hmacKey": "00112233445566778899aabbccddeeff00112233445566778899aabbccddeeff", "mandatoryPointers": [ "/credentialSubject/sailNumber", "/credentialSubject/sails/1", "/credentialSubject/boards/0/year", "/credentialSubject/sails/2" ] }
Next, the holder needs to indicate what else, if anything, they wish to reveal to the verifiers, by specifying JSON pointers for selective disclosure. In our windsurfing competition scenario, a sailor (the holder) has just completed their first day of racing, and wishes to reveal to the general public (the verifiers) all the details of the windsurfing boards they used in the competition. These are shown below. Note that this slightly overlaps with the mandatory disclosed information which included only the year of their most recent board.
["/credentialSubject/boards/0", "/credentialSubject/boards/1"]
To produce the revealDocument
(i.e., the unsigned document that will
eventually be signed and sent to the verifier), we append the selective pointers
to the mandatory pointers, and input these combined pointers along with the
document without proof to the selectJsonLd
algorithm of [DI-ECDSA],
to get the result shown below.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7, "sailName": "Lahaina", "year": 2020 } ], "boards": [ { "year": 2022, "boardName": "CompFoil170", "brand": "Wailea" }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] } }
Now that we know what the revealed document looks like, we need to furnish appropriately updated information to the verifier about which statements are mandatory, and the indexes for the selected non-mandatory statements. Running step 6 of the algorithm in Section 3.2.3 createDisclosureData yields an abundance of information about various statement groups relative to the original document. Below we show a portion of the indexes for those groups.
{ "combinedIndexes": [0, 1, 2, 6, 7, 8, 12, 13, 14, 15, 16, 17, 18, 19, 20, 23, 24, 25, 26], "mandatoryIndexes": [0, 1, 2, 8, 12, 13, 14, 15, 16, 17, 19, 20, 23 ], "nonMandatoryIndexes": [3, 4, 5, 6, 7, 9, 10, 11, 18, 21, 22, 24, 25, 26], "selectiveIndexes": [6, 7, 8, 12, 13, 17, 18, 24, 25, 26] }
The verifier needs to be able to aggregate and hash the mandatory statements. To
enable this, we furnish them with a list of indexes of the mandatory statements
adjusted to their positions in the reveal document (i.e., relative to the
combinedIndexes
), while the selectiveIndexes
need to be adjusted relative to
their positions within the nonMandatoryIndexes
. These "adjusted" indexes are
shown below.
{ "adjMandatoryIndexes":[0,1,2,5,6,7,8,9,10,11,13,14,15], "adjSelectiveIndexes":[3,4,8,11,12,13] }
The last important piece of disclosure data is a mapping of canonical blank node
IDs to HMAC-based shuffled IDs, the labelMap
, computed according to Section
3.2.3 createDisclosureData. This is shown below along with
the rest of the disclosure data minus the reveal document.
{ "bbsProof":"b29c719aba8103c713c5facba9b690930ad458816645adc1a53b251010bc3b128d72580239f66ff4e9739e28425794e881b5737fb3abce02b2655d4fb3babebd515685ce7567eab5bd01360e8131150576357509db309294569d822d56e1c581420a8af29b7c7984d50fd5c79a06d64a2586da8a24e93c3742d09f2c0e24d7fe4891927c7ffe408d563a64f586737867a1f020f742fc6eaa1d37eda426c9c75566de8be54822f69749fc462c86caaaf4f9f73ee1b08726f378432e382322a3cc0e87d5b23fc36364bc5c94cfb8a305be6f912bd7152e7a48d4d41571c653d58e5fea8a8238e05aea910e5b62c9d15b8d527c0d59f619fbab6a8799b1ce1da13c6516c23eefc03b247672878c34949943e02f4b3991139276c89a00c4ee64bbce570201ac3502fb4769e6b869919320ad9f3121dfeeecdb2914cfc7d4a386b6153f54b18b4148742ec7b66c81cff0b1de88d2d299f35f2ff817fb422fe0bbf65b5cd7deb939a10cc524f08eff46f31b5631afbd0551d9816e32fb2e4bb7214ce76136057c1298e2a161b5ec3280f0530130ab9600426c7e521d1b893850ae83cf4f211987c93f3a41c16b0cbac29e5dcf88eb65892518f643d5c2acd4888045d4", "labelMap":{"dataType":"Map", "value":[["c14n0","b2"],["c14n1","b4"],["c14n2","b7"],["c14n3","b6"],["c14n4","b5"],["c14n5","b0"]] }, "mandatoryIndexes":[0,1,2,5,6,7,8,9,10,11,13,14,15], "adjSelectiveIndexes":[3,4,8,11,12,13] }
Finally, using the disclosure data above with the algorithm of Section 3.2.6 serializeDerivedProofValue, we obtain the signed derived (reveal) document shown below.
{ "@context": [ "https://www.w3.org/ns/credentials/v2", { "@vocab": "https://windsurf.grotto-networking.com/selective#" } ], "type": [ "VerifiableCredential" ], "credentialSubject": { "sailNumber": "Earth101", "sails": [ { "size": 6.1, "sailName": "Lahaina", "year": 2023 }, { "size": 7, "sailName": "Lahaina", "year": 2020 } ], "boards": [ { "year": 2022, "boardName": "CompFoil170", "brand": "Wailea" }, { "boardName": "Kanaha Custom", "brand": "Wailea", "year": 2019 } ] }, "proof": { "type": "DataIntegrityProof", "cryptosuite": "bbs-2023", "created": "2023-08-15T23:36:38Z", "verificationMethod": "did:key:zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ#zUC7DerdEmfZ8f4pFajXgGwJoMkV1ofMTmEG5UoNvnWiPiLuGKNeqgRpLH2TV4Xe5mJ2cXV76gRN7LFQwapF1VFu6x2yrr5ci1mXqC1WNUrnHnLgvfZfMH7h6xP6qsf9EKRQrPQ", "proofPurpose": "assertionMethod", "proofValue": "u2V0DhNhAWQHAspxxmrqBA8cTxfrLqbaQkwrUWIFmRa3BpTslEBC8OxKNclgCOfZv9OlznihCV5TogbVzf7OrzgKyZV1Ps7q-vVFWhc51Z-q1vQE2DoExFQV2NXUJ2zCSlFadgi1W4cWBQgqK8pt8eYTVD9XHmgbWSiWG2ook6Tw3QtCfLA4k1_5IkZJ8f_5AjVY6ZPWGc3hnofAg90L8bqodN-2kJsnHVWbei-VIIvaXSfxGLIbKqvT59z7hsIcm83hDLjgjIqPMDofVsj_DY2S8XJTPuKMFvm-RK9cVLnpI1NQVccZT1Y5f6oqCOOBa6pEOW2LJ0VuNUnwNWfYZ-6tqh5mxzh2hPGUWwj7vwDskdnKHjDSUmUPgL0s5kROSdsiaAMTuZLvOVwIBrDUC-0dp5rhpkZMgrZ8xId_u7NspFM_H1KOGthU_VLGLQUh0Lse2bIHP8LHeiNLSmfNfL_gX-0Iv4Lv2W1zX3rk5oQzFJPCO_0bzG1Yxr70FUdmBbjL7Lku3IUznYTYFfBKY4qFhtewygPBTATCrlgBCbH5SHRuJOFCug89PIRmHyT86QcFrDLrCnl3PiOtliSUY9kPVwqzUiIBF1KYAAgEEAgcDBgQFBQCNAAECBQYHCAkKCw0OD4YDBAgLDA0" } }
Portions of the work on this specification have been funded by the United States Department of Homeland Security's (US DHS) Silicon Valley Innovation Program under contracts 70RSAT20T00000003, and 70RSAT20T00000033. The content of this specification does not necessarily reflect the position or the policy of the U.S. Government and no official endorsement should be inferred.