Results of Questionnaire W3C PROV Implementation Survey

The results of this questionnaire are available to anybody. In addition, answers are sent to the following email address: zednis@rpi.edu

This questionnaire was open from 2012-01-11 to 2013-04-16.

41 answers have been received.

Jump to results for question:

  1. Implementation Information
  2. Contact Information
  3. Implementation Type
  4. PROV Encodings Supported
  5. Feature Coverage
  6. Does this implementation support any constraints or inferences defined by the PROV-CONSTRAINTS?
  7. Framework Usage
  8. Provenance Exchange

1. Implementation Information

Details

Responder | Name | URL | Description
Danius Michaelides StatJR eBook system http://www.bristol.ac.uk/cmm/software/statjr/index.html StatJR is a statistical modelling software package that is designed to be open and extensible. Its target audience is researchers in the social sciences, and there is a strong pedagogical aspect to the software via the eBook interface. This interface makes heavy use of PROV to represent the interactions and computations that occur as a user reads eBooks. PROV is also used to drive some of the functionality of the system, such as caching executions and allowing the user to navigate to past versions of the eBook. The system is described in our IPAW 2012 paper, "DEEP: A Provenance-Aware Executable Document System".
Michael Jewell PoN http://tina.ecs.soton.ac.uk/djangopon/ PoN is a web application for collecting, organizing and browsing archaeological research data and notes. The web application can be accessed through both computers and smartphones. All versions of artefacts are retained, allowing users to see how interpretations have developed.

PoN has been developed as part of the PATINA project, which aims to revolutionise the design of technologies for supporting research. The PATINA project was funded by the Engineering and Physical Sciences Research Council (EPSRC) and the Arts and Humanities Research Council (AHRC) through the RCUK Digital Economy programme.
Daniel Garijo WingsProvenanceExport https://github.com/dgarijo/WingsProvenanceExport Provenance export for the Wings workflow system, using the OPMW vocabulary. OPMW extends PROV core concepts to represent scientific workflows. More information at: http://www.opmw.org/node/8

Input: OWL ontologies and TTL files used to specify templates and summaries of traces, respectively. Output: two files, one in PROV and one in OPM.

The application generates provenance. It does not consume it.
Stian Soiland-Reyes Taverna https://github.com/wf4ever/taverna-prov/ Plugin for Taverna workflow system for exporting workflow runs as PROV RDF. Provenance data includes information about start/stop of individual activities in a workflow definition, the workflow as a whole, and any nested workflows. Data items (entities) are identified as used and generated by the different steps.

The RDF uses the vocabularies PROV-O, wfprov and a customization of the two called tavernaprov (see the separate vocabulary registration), and is saved as a single Turtle file together with file representations of input, output and intermediate values.
Trung Dong Huynh CollabMap http://www.collabmap.org/ CollabMap is a platform designed to crowdsource the task of identifying building evacuation routes to a large number of users, offering them freely available data, such as satellite imagery (e.g. Google Maps), panoramic views (e.g. Google Street View) and building shapes, to carry out this task. The application tracks the provenance of its users' contributions for quality verification purposes.
Clément Caron WebLab-PROV http://weblab-project.org/index.php?title=WebLab-PROV Application of provenance on the WebLab platform, using the PROV ontology.
Luc Moreau ProvToolbox https://github.com/lucmoreau/ProvToolbox The ProvToolbox is a Java toolbox to create and convert W3C Provenance PROV representations between Java, RDF, XML, PROV-N, JSON, and dot.
Hook Hua Provenance for Earth Science tbd Extends PROV-O with a provenance representation for use in NASA's Earth Science Data Systems. Also assessing overlap of terms in ISO 19115-* Lineage.
Eric Stephan Provenance Environment (ProvEn) Services Coming soon ProvEn Services enable scientific teams to publish, share, link, and discover knowledge about their scientific research results. In science, provenance is produced in many different manual and automated ways and is highly expressive. Scientific teams producing results need a means to provide a composite origin story of the dataset to future consumers while maintaining privacy. ProvEn Services provides Extract, Translate, Load services for users to provide native sources of provenance to build a composite history. It relies on PROV-O, Dublin Core and other foundational ontologies so that diverse scientific knowledge can be cross-referenced.
Amir Sezavar Keshavarz Annotation Inference Framework http://users.ecs.soton.ac.uk/ask2g10/prov/ Annotation inference is a form of inference that, given a provenance graph with some annotations, infers new annotations for the same graph. Annotation is introduced as a generic mechanism to enable users to attach any information to the elements of a provenance graph for domain-specific interpretation of the provenance of data. This framework can be instantiated based on various needs.
Simon Miles PROVoKing https://sites.google.com/site/provokinglibrary/ A general Java library for supporting the use of W3C PROV in Java applications.
Olaf Hartig Triplify http://triplify.org/ Triplify is a small plugin for Web applications which reveals the semantic structures encoded in relational databases by making database content available as Linked Data. The customizable metadata component, which by default adds some provenance information to the published data, is documented here: http://sourceforge.net/apps/mediawiki/trdf/index.php?title=Triplify_Metadata_Extension
Paolo Missier Prov-gen https://github.com/PaoloMissier/ProvToolbox/tree/master/prov-gen A simple generator that produces a large PROV instance from a seed PROV instance, which can be defined using PROV-N or interactively through a simple GUI.
The generated PROV instance is encoded as PROV-N.
This is described in an M.Sc. dissertation: https://github.com/PaoloMissier/ProvToolbox/blob/master/prov-gen/WilliamMartin-MSc-Dissertation.pdf
Edoardo Pignotti OBIAMA (Ontology-Based Integrated Action Modelling Arena) https://github.com/garypolhill/obiama A prototype discrete-event simulation environment that represents the state and structure of the model at any one time using OWL ontologies. OBIAMA is capable of generating provenance about actions performed by agents in a simulation model using PROV-O.
Jacco van Ossenbruggen Amalgame http://semanticweb.cs.vu.nl/amalgame/ Amalgame is a tool for finding, evaluating and managing vocabulary alignments interactively. Because the process is interactive, we need PROV to record exactly what workflow has been executed to be able to interpret the results later.
Olaf Hartig D2R Server http://d2rq.org/d2r-server D2R Server is a tool for publishing relational databases on the Semantic Web. It enables RDF and HTML browsers to navigate the content of the database, and allows querying the database using the SPARQL query language. The customizable metadata component, which by default adds some provenance information to the published data, is documented here: http://sourceforge.net/apps/mediawiki/trdf/index.php?title=D2R_Server_Metadata_Extension
Trung Dong Huynh Provenance server https://provenance.ecs.soton.ac.uk/store The Provenance Server is a web service that allows storing, browsing and managing provenance documents. The server can be accessed via a Web interface or via a REST API (using an API key or OAuth authentication).
Trung Dong Huynh agentSwitch http://hac.ecs.soton.ac.uk/agentswitch/ agentSwitch is a personalized energy tariff-recommender system which analyses electricity consumption and offers recommendations on:
- the best tariffs to reduce electricity bills
- shifting usage to night-time to benefit from lower electricity rates
Reza B'Far Oracle Enterprise Transactions Controls Governor 8.6.4 https://updates.oracle.com/Orion/Services/download/p14786779_864_Linux-x86-64.zip?aru=15596267&patch_file=p14786779_864_Linux-x86-64.zip Oracle Enterprise Transactions Controls Governor 8.6.4
Jun Zhao Pubby http://wifo5-03.informatik.uni-mannheim.de/pubby/ Pubby can be used to add Linked Data interfaces to SPARQL endpoints. The customizable metadata component, which by default adds some provenance information to the published data, is documented here: http://sourceforge.net/apps/mediawiki/trdf/index.php?title=Pubby_Metadata_Extension
Satya Sahoo Semantic Proteomics Dashboard (SemPoD) http://physiomimi.case.edu/sempod/index.php/Main_Page The SemPoD platform, currently in use at the Case Center for Proteomics and Bioinformatics (CPB), extends the PROV Ontology (PROV-O) to support provenance–aware querying of 1153 mass-spectrometry experiments from 20 different projects. SemPoD consists of three components: Ontology-driven Visual Query Composer, Result Explorer, and Query Manager. SemPoD includes a dynamic query composition interface, which automatically updates the components of the query interface based on previous user selections and efficiently prunes the result set using a “smart filtering” approach based on the provenance of the results.

SemPoD Webpage: http://physiomimi.case.edu/sempod/index.php/Main_Page
Mohamed Morsey DeFacto http://defacto.aksw.org DeFacto (Deep Fact Validation) is an algorithm for validating statements by finding confirming sources for them on the web. It takes a statement (such as “Jamaica Inn was directed by Alfred Hitchcock”) as input and then tries to find evidence for the truth of that statement by searching for information on the web.
Chris Baillie Quality Assessment Framework https://github.com/cbaillie/QualityAssessmentFramework This framework enables quality assessment of a given (RDF) entity based on user-defined requirements. Provenance described using the PROV-O ontology can be included as part of the quality requirement rules, which are described using SPIN-SPARQL rules. At present quality requirements are based on the Data Quality Management ontology (http://semwebquality.org/dqm-vocabulary/v1/dqm). This framework can capture the provenance of the quality assessment process using PROV-O so that future quality assessments can make decisions about re-using existing quality scores.
Stephan Zednik Global Change Information System - Information Model and Semantic Application Prototypes http://tw.rpi.edu/web/project/gcis-imsap The Tetherless World Constellation (TWC) at Rensselaer Polytechnic Institute (RPI) proposes to facilitate the vocabulary and ontology development within the context of the overall development of semantic prototypes for the National Climate Assessment (NCA) portals using a combination of environmental inter-agency collaborations in a use-case focused workshop setting, information modeling, and software developments and deployments. The prototypes are intended to provide search and browse options that inspire confidence that all relevant information has been found; data providers will be citable with detailed provenance generation. Expected deliverables are: information models, vocabulary and ontology services for vetted climate assessment settings, and search/ browse prototypes.
Landong Zuo OpenUp Prov TSO's implementation of PROV provides an object-oriented model for non-semantic developers to generate provenance RDF in compliance with the PROV specification.

The Java-based tool integrates Apache Jena 2.7.2 to support RDF serialisation. It can have a triple-store backend, where the provenance trail can be published progressively.

To-do list: digital signatures of provenance, PROV validation
Tom De Nies APROVeD: Automatic Provenance Derivation http://users.ugent.be/~tdenies/APROVeD/ In this project, we develop a new approach for automatic derivation of provenance, at multiple levels of granularity. To accomplish this, we detect entity derivations, relying on clustering algorithms, linked data and semantic similarity. The resulting derivations are structured in compliance with the Provenance Data Model (PROV-DM). While the proposed approach is purposely kept general, allowing adaptation in many use cases, we provide an implementation for one of these use cases, namely discovering the sources of news articles.
Rinke Hoekstra Raw2LD http://github.com/Data2Semantics/raw2ld Conversion scripts for converting Adverse Events reports to RDF. Uses PROV to describe a provenance trail of all conversion steps
Paolo Missier PROV-N to Neo4J DB mapping https://github.com/PaoloMissier/PROV_neo4j A simple Java program to upload a PROV-N encoded PROV instance into a Neo4J database.
This core implementation only covers a few basic relations. More should be added.

Built on the older 0.0.1-SNAPSHOT of ProvToolbox (see the implementation section). Needs updating.
As this was part of a graduate student project in 2012, fresh resources are needed to upgrade and complete the implementation.
Peter Slaughter Earth System Science Server http://es3.eri.ucsb.edu The Earth System Science Server (ES3) is a software environment for data-intensive Earth science. ES3 has unique capabilities for automatically and transparently capturing, managing and reconstructing the provenance of arbitrary, unmodified computational sequences.
David Corsar prov-api https://github.com/dcorsar/prov-api/ Java API for creating and manipulating provenance graphs. The API currently only implements the PROV core terms. We have two implementations of the API: one using Jena and one based on SPARQL v1.1. The Jena implementation supports building and querying a provenance model using an in memory ontology model. The SPARQL implementation builds a provenance graph by generating a series of SPARQL updates, which can then be used against an appropriate resource (e.g. Jena model, SPARQL v1.1 endpoint); querying is performed by generating SPARQL queries which are executed using a provided query engine.
Edoardo Pignotti Policy Reasoning Framework https://github.com/epignotti/PolicyReasoner A policy reasoning framework based on OWL and the SPIN-SPARQL reasoner. This framework infers the provenance of the policy reasoning process using SPARQL rules. The inferred provenance is represented using PROV-O.
David Corsar Informed Rural Passenger Information Infrastructure http://www.dotrural.ac.uk/irp/index.php?page=software As part of the Informed Rural Passenger project, we are developing a real time passenger information system supported by a framework of web services that integrate and use data from various sources (the crowd, open data providers, etc.). Prov-O is used to capture the provenance of various entities within the framework, for the purpose of supporting assessments of data quality and trustworthiness; see http://www.dotrural.ac.uk/irp/index.php?page=software for more details.
Peer Brauer PubFlow Provenance Archive www.pubflow.de/en/provenanceArchive The PubFlow provenance archive is used to store the provenance information collected by the PubFlow research data publication framework. PubFlow is a framework to transfer research data from a local repository to publicly available archives. Whenever PubFlow works on the research data, the corresponding provenance information is collected and stored in the provenance archive.
Trung Dong Huynh PROV Python library http://pypi.python.org/pypi/prov The library provides an implementation of the PROV Data Model in Python (a brief usage sketch follows this table). It contains a number of sub-modules:
- prov.model: In-memory classes for PROV assertions, JSON serialisation and deserialisation, PROV-N serialisation.
- prov.persistence: A Django app for storing and loading ProvBundle instances to/from databases using the Django ORM.
- prov.tracking: A logging-like module to facilitate tracking provenance in Python programs.
Timothy Lebo csv2rdf4lod-automation https://github.com/timrdf/csv2rdf4lod-automation/wiki csv2rdf4lod is designed to aggregate and integrate multiple versions of multiple datasets of multiple source organizations in an incremental and backward-compatible way.
Sara Magliacane recoprov https://bitbucket.org/saramagliacane/recoprov/ A prototype implementation that tries to reconstruct the provenance of documents in a shared folder, mostly by using multi-modal similarities and metadata as evidence of dependencies among documents.
Timothy Lebo DataFAQs https://github.com/timrdf/DataFAQs/wiki An automated asynchronous Linked Data quality evaluation framework.
Timothy Lebo provx2o https://github.com/timrdf/provenanceweb/blob/master/src/provx2o.xsl GRDDL XSL transformation from PROV-XML to PROV-O.
Ashley Smith Hedgehog https://github.com/cgutteridge/Hedgehog Extensible software that creates, converts and publishes datasets in RDF.
Spyros Kotoulas QuerioCity research prototype QuerioCity manages the information of a city. It captures and exposes provenance with regard to the origin of information and various transformations of the data.
Chester Chen Tinga Provenance Service http://www.tingatech.com/ Tinga provenance service is part of the Tinga financial analytic platform. It provides provenance service for financial data, apps and related information
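
As an illustration of the PROV Python library entry above, the following is a minimal usage sketch, assuming the current API of the prov package (the 0.x releases current at the time of this survey differ in places); the identifiers are purely illustrative.

from prov.model import ProvDocument

# Build a small PROV document in memory (hypothetical example identifiers).
doc = ProvDocument()
doc.add_namespace('ex', 'http://example.org/')

report = doc.entity('ex:report')              # prov:Entity
compiling = doc.activity('ex:compileReport')  # prov:Activity
alice = doc.agent('ex:alice')                 # prov:Agent

doc.wasGeneratedBy(report, compiling)         # Generation
doc.wasAssociatedWith(compiling, alice)       # Association
doc.wasAttributedTo(report, alice)            # Attribution

print(doc.get_provn())    # PROV-N serialisation
print(doc.serialize())    # PROV-JSON serialisation (the library's default format)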

2. Contact Information

Details

Responder | Name | Email
Danius Michaelides Danius Michaelides dtm@ecs.soton.ac.uk
Michael Jewell Michael Jewell mjewell@gmail.com
Daniel Garijo Daniel Garijo dgarijo@fi.upm.es
Stian Soiland-Reyes Stian Soiland-Reyes soiland-reyes@cs.manchester.ac.uk
Trung Dong Huynh Trung Dong Huynh dong.huynh@soton.ac.uk
Clément Caron Clément Caron caron.clement@gmail.com
Luc Moreau Luc Moreau l.moreau@ecs.soton.ac.uk
Hook Hua Hook Hua hook.hua@jpl.nasa.gov
Eric Stephan Eric Stephan eric.stephan@pnl.gov
Amir Sezavar Keshavarz Amir Sezavar Keshavarz amir.keshavarz@gmail.com
Simon Miles Simon Miles simon.miles@kcl.ac.uk
Olaf Hartig Olaf Hartig ohartig@uwaterloo.ca
Paolo Missier Paolo Missier Paolo.Missier@ncl.ac.uk
Edoardo Pignotti Gary Polhill gary.polhill@hutton.ac.uk
Jacco van Ossenbruggen Jacco van Ossenbruggen Jacco.van.Ossenbruggen@cwi.nl
Olaf Hartig Olaf Hartig ohartig@uwaterloo.ca
Trung Dong Huynh Trung Dong Huynh donght@soton.ac.uk
Trung Dong Huynh Trung Dong Huynh tdh@ecs.soton.ac.uk
Reza B'Far Reza BFar reza.bfar@oracle.com
Jun Zhao Olaf Hartig/Jun Zhao ohartig@uwaterloo.ca/jun.zhao@zoo.ox.ac.uk
Satya Sahoo Catherine Jayapandian, Satya Sahoo sempodcwru@googlegroups.com
Mohamed Morsey Jens Lehmann lehmann@informatik.uni-leipzig.de
Chris Baillie Chris Baillie c.baillie@abdn.ac.uk
Stephan Zednik Stephan Zednik zednis@rpi.edu
Landong Zuo Landong Zuo landong.zuo@tso.co.uk
Tom De Nies Tom De Nies tom.denies@ugent.be
Rinke Hoekstra Rinke Hoekstra rinke.hoekstra@vu.nl
Paolo Missier Paolo Missier Paolo.Missier@ncl.ac.uk
Peter Slaughter Peter Slaughter peter@bren.ucsb.edu
David Corsar David Corsar, Christopher Baillie dcorsar@abdn.ac.uk, c.baillie@abdn.ac.uk
Edoardo Pignotti Edoardo Pignotti csc256@abdn.ac.uk
David Corsar David Corsar csc316@abdn.ac.uk
Peer Brauer Peer Brauer pcb@informatik.uni-kiel.de
Trung Dong Huynh Trung Dong Huynh tdh@ecs.soton.ac.uk
Timothy Lebo Timothy Lebo lebot@rpi.edu
Sara Magliacane Sara Magliacane s.magliacane@vu.nl
Timothy Lebo Timothy Lebo lebot@rpi.edu
Timothy Lebo Timothy Lebo lebot@rpi.edu
Ashley Smith Ashley Smith ads04r@ecs.soton.ac.uk
Spyros Kotoulas Spyros Kotoulas spyros.kotoulas@ie.ibm.com
Chester Chen Chester Chen chester@Tingatech.com

3. Implementation Type

Choose all that apply

Summary

Choice | All responders
Results
Application (consumes / generates provenance) 30
Framework / API 10
Service 10

View by responder

Details

Responder | Implementation Type
Danius Michaelides
  • Application (consumes / generates provenance)
Michael Jewell
  • Application (consumes / generates provenance)
Daniel Garijo
  • Application (consumes / generates provenance)
Stian Soiland-Reyes
  • Application (consumes / generates provenance)
Trung Dong Huynh
  • Application (consumes / generates provenance)
Clément Caron
  • Application (consumes / generates provenance)
Luc Moreau
  • Framework / API
  • Service
Hook Hua
  • Application (consumes / generates provenance)
  • Service
Eric Stephan
  • Application (consumes / generates provenance)
  • Service
Amir Sezavar Keshavarz
  • Application (consumes / generates provenance)
  • Framework / API
Simon Miles
  • Framework / API
Olaf Hartig
  • Service
Paolo Missier
  • Application (consumes / generates provenance)
Edoardo Pignotti
  • Application (consumes / generates provenance)
Jacco van Ossenbruggen
  • Application (consumes / generates provenance)
  • Framework / API
  • Service
Olaf Hartig
  • Service
Trung Dong Huynh
  • Service
Trung Dong Huynh
  • Application (consumes / generates provenance)
Reza B'Far
  • Application (consumes / generates provenance)
Jun Zhao
  • Service
Satya Sahoo
  • Application (consumes / generates provenance)
Mohamed Morsey
  • Application (consumes / generates provenance)
Chris Baillie
  • Framework / API
Stephan Zednik
  • Application (consumes / generates provenance)
Landong Zuo
  • Application (consumes / generates provenance)
Tom De Nies
  • Application (consumes / generates provenance)
Rinke Hoekstra
  • Application (consumes / generates provenance)
Paolo Missier
  • Application (consumes / generates provenance)
Peter Slaughter
  • Application (consumes / generates provenance)
David Corsar
  • Framework / API
Edoardo Pignotti
  • Framework / API
David Corsar
  • Application (consumes / generates provenance)
Peer Brauer
  • Application (consumes / generates provenance)
  • Framework / API
Trung Dong Huynh
  • Framework / API
Timothy Lebo
  • Application (consumes / generates provenance)
Sara Magliacane
  • Application (consumes / generates provenance)
Timothy Lebo
  • Application (consumes / generates provenance)
Timothy Lebo
  • Application (consumes / generates provenance)
Ashley Smith
  • Application (consumes / generates provenance)
Spyros Kotoulas
  • Application (consumes / generates provenance)
  • Framework / API
  • Service
Chester Chen
  • Service

View by choice

Choice | Responders
Application (consumes / generates provenance)
  • Danius Michaelides
  • Michael Jewell
  • Daniel Garijo
  • Stian Soiland-Reyes
  • Trung Dong Huynh
  • Clément Caron
  • Hook Hua
  • Eric Stephan
  • Amir Sezavar Keshavarz
  • Paolo Missier
  • Edoardo Pignotti
  • Jacco van Ossenbruggen
  • Trung Dong Huynh
  • Reza B'Far
  • Satya Sahoo
  • Mohamed Morsey
  • Stephan Zednik
  • Landong Zuo
  • Tom De Nies
  • Rinke Hoekstra
  • Paolo Missier
  • Peter Slaughter
  • David Corsar
  • Peer Brauer
  • Timothy Lebo
  • Sara Magliacane
  • Timothy Lebo
  • Timothy Lebo
  • Ashley Smith
  • Spyros Kotoulas
Framework / API
  • Luc Moreau
  • Amir Sezavar Keshavarz
  • Simon Miles
  • Jacco van Ossenbruggen
  • Chris Baillie
  • David Corsar
  • Edoardo Pignotti
  • Peer Brauer
  • Trung Dong Huynh
  • Spyros Kotoulas
Service
  • Luc Moreau
  • Hook Hua
  • Eric Stephan
  • Olaf Hartig
  • Jacco van Ossenbruggen
  • Olaf Hartig
  • Trung Dong Huynh
  • Jun Zhao
  • Spyros Kotoulas
  • Chester Chen

4. PROV Encodings Supported

Choose all that apply

Summary

Choice | All responders
Results
PROV-O 31
PROV-N 11
PROV-XML 7

View by responder

Details

Responder | PROV Encodings Supported | Please list any additional supported encodings (e.g. PROV-JSON, PROV-CSV, etc.) in the free-text area below
Danius Michaelides
  • PROV-O
PROV-JSON
Michael Jewell
  • PROV-N
PROV-JSON, PROV-N (and RDF serialization in future) via the provpy module.
Daniel Garijo
  • PROV-O
Stian Soiland-Reyes
  • PROV-O
Trung Dong Huynh PROV-JSON
Clément Caron
  • PROV-O
  • PROV-N
Luc Moreau
  • PROV-O
  • PROV-N
  • PROV-XML
prov-json
Hook Hua
  • PROV-O
Eric Stephan
  • PROV-O
Amir Sezavar Keshavarz
  • PROV-O
  • PROV-N
  • PROV-XML
PROV-JSON
Simon Miles
  • PROV-O
Olaf Hartig
  • PROV-O
Paolo Missier
  • PROV-N
the author implemented an intermediate object model to represent PROV instances, called PROV-Java. Available here:
https://github.com/PaoloMissier/ProvToolbox/tree/master/prov-java
Edoardo Pignotti
  • PROV-O
Jacco van Ossenbruggen
  • PROV-O
Olaf Hartig
  • PROV-O
Trung Dong Huynh
  • PROV-N
PROV-JSON
Trung Dong Huynh
  • PROV-N
PROV-JSON
Reza B'Far
  • PROV-O
  • PROV-XML
Jun Zhao
  • PROV-O
Satya Sahoo
  • PROV-O
Mohamed Morsey
  • PROV-O
Chris Baillie
  • PROV-O
Stephan Zednik
  • PROV-O
Landong Zuo
  • PROV-O
Tom De Nies
  • PROV-N
PROV-JSON
Rinke Hoekstra
  • PROV-O
Paolo Missier
  • PROV-N
Maps to a graph schema for the Neo4J DB.
Peter Slaughter
  • PROV-XML
PROV-JSON
David Corsar
  • PROV-O
Edoardo Pignotti
  • PROV-O
David Corsar
  • PROV-O
Peer Brauer
  • PROV-O
  • PROV-XML
Trung Dong Huynh
  • PROV-N
PROV-JSON
Timothy Lebo
  • PROV-O
Sara Magliacane
  • PROV-O
  • PROV-N
Timothy Lebo
  • PROV-O
Timothy Lebo
  • PROV-O
  • PROV-XML
Ashley Smith
  • PROV-XML
Spyros Kotoulas
  • PROV-O
Chester Chen
  • PROV-O
PROV-JSON

View by choice

Choice | Responders
PROV-O
  • Danius Michaelides
  • Daniel Garijo
  • Stian Soiland-Reyes
  • Clément Caron
  • Luc Moreau
  • Hook Hua
  • Eric Stephan
  • Amir Sezavar Keshavarz
  • Simon Miles
  • Olaf Hartig
  • Edoardo Pignotti
  • Jacco van Ossenbruggen
  • Olaf Hartig
  • Reza B'Far
  • Jun Zhao
  • Satya Sahoo
  • Mohamed Morsey
  • Chris Baillie
  • Stephan Zednik
  • Landong Zuo
  • Rinke Hoekstra
  • David Corsar
  • Edoardo Pignotti
  • David Corsar
  • Peer Brauer
  • Timothy Lebo
  • Sara Magliacane
  • Timothy Lebo
  • Timothy Lebo
  • Spyros Kotoulas
  • Chester Chen
PROV-N
  • Michael Jewell
  • Clément Caron
  • Luc Moreau
  • Amir Sezavar Keshavarz
  • Paolo Missier
  • Trung Dong Huynh
  • Trung Dong Huynh
  • Tom De Nies
  • Paolo Missier
  • Trung Dong Huynh
  • Sara Magliacane
PROV-XML
  • Luc Moreau
  • Amir Sezavar Keshavarz
  • Reza B'Far
  • Peter Slaughter
  • Peer Brauer
  • Timothy Lebo
  • Ashley Smith

5. Feature Coverage

Indicate covered features by selecting one of the following below:

  • 1) I Don't Know
  • 2) No Support
  • 3) Read-only Support
  • 4) Write-only Support
  • 5) Read and Write Support

Summary

Choice | All responders
1 | 2 | 3 | 4 | 5 | No opinion
Entity 2 18 20 1
Activity 3 18 18 2
Agent 1 2 18 18 2
Generation 3 3 19 14 2
Usage 2 3 20 14 2
Communication 1 19 2 4 7 8
Derivation 2 6 2 13 11 7
Attribution 1 12 1 9 10 8
Association 4 2 18 10 7
Delegation 1 13 1 6 8 12
Start 2 12 2 6 11 8
End 3 10 2 7 10 9
Invalidation 1 21 1 7 11
Revision 2 15 2 3 7 12
Quotation 3 18 1 6 13
PrimarySource 2 16 2 1 8 12
Person 12 2 6 10 11
Organization 16 1 2 10 12
SoftwareAgent 12 2 8 9 10
Plan 2 11 3 11 6 8
Influence 19 1 3 5 13
Bundle 2 13 1 6 8 11
Specialization 1 17 2 3 7 11
Alternate 1 17 1 3 6 13
Collection 1 17 2 7 14
EmptyCollection 1 19 1 5 15
Membership 1 17 1 8 14
Identifier 14 1 6 8 12
Attributes 1 11 2 8 8 11
Label 13 1 5 8 14
Location 2 17 1 3 6 12
Role 2 17 2 2 6 12
Type 2 10 1 8 9 11
Value 1 11 1 7 8 13

Averages:

Choice | Average value (all responders)
Entity 4.45
Activity 4.38
Agent 4.36
Generation 4.13
Usage 4.18
Communication 2.91
Derivation 3.74
Attribution 3.45
Association 4.00
Delegation 3.24
Start 3.36
End 3.34
Invalidation 2.70
Revision 2.93
Quotation 2.57
PrimarySource 2.90
Person 3.47
Organization 3.21
SoftwareAgent 3.45
Plan 3.24
Influence 2.79
Bundle 3.17
Specialization 2.93
Alternate 2.86
Collection 2.89
EmptyCollection 2.62
Membership 2.93
Identifier 3.28
Attributes 3.37
Label 3.30
Location 2.79
Role 2.76
Type 3.40
Value 3.36

Details

Responder | Entity | Activity | Agent | Generation | Usage | Communication | Derivation | Attribution | Association | Delegation | Start | End | Invalidation | Revision | Quotation | PrimarySource | Person | Organization | SoftwareAgent | Plan | Influence | Bundle | Specialization | Alternate | Collection | EmptyCollection | Membership | Identifier | Attributes | Label | Location | Role | Type | Value | Rationale
Danius Michaelides 5 5 2 5 5 2 4 2 2 2 5 5 2 2 2 2 2 2 2 2 2 5 2 2 2 2 2 5 5 2 2 2 5 5
Michael Jewell 4 4 4 4 4 2 4 4 4 2 4 4 2 4 2 2 4 2 4 2 2 4 4 4 2 2 2 4 4 4 2 2 4 4
Daniel Garijo 4 4 4 4 4 2 2 4 4 2 2 2 2 2 2 4 2 2 2 4 2 4 2 2 2 2 2 2 2 4 4 2 2 4 The tool makes use of the binary relationships. Classes Use, Generation and Association are not used, but used, wasGeneratedBy and wasAssociatedWith are.
Stian Soiland-Reyes 4 4 4 4 4 4 2 2 4 2 4 4 2 2 2 2 2 2 4 4 2 4 2 2 4 4 4 4 4 4 2 4 2 2
Trung Dong Huynh 4 4 4 4 4 2 4 2 4 2 2 2 2 4 2 2 4 2 2 4 2 2 2 2 2 2 2 4 4 2 2 2 4 4
Clément Caron 5 5 5 5 5 2 5 2 5 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
Luc Moreau 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 The toolbox is capable of reading rdf/xml/provn/json serializations and saving them. It can also convert a PROV representation to dot. The command-line executable provconvert is built from the source code, and a service is also deployed.
Hook Hua 5 5 5 5 5 2 No opinion 2 2 2 5 5 2 No opinion 2 2 5 5 5 No opinion 2 5 2 2 No opinion No opinion No opinion 2 No opinion No opinion 1 1 1 1 no option for "not yet, planning to support". left those as "no opinion".
Eric Stephan 5 5 5 5 5 1 1 No opinion 4 4 1 1 1 3 1 1 5 5 4 5 No opinion No opinion No opinion No opinion No opinion No opinion 5 No opinion No opinion No opinion No opinion No opinion No opinion No opinion Features are being driven by scientific use cases. We envision coverage evolving over time.
Amir Sezavar Keshavarz 5 5 5 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 3 5 3 3 2 2 2 5 3 3 2 2 3 3 Except collection and membership, Entity, Activity, and Agent can be consumed and produced. Relations such as Generation, Usage, Derivation and others are consumed.
Simon Miles 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 4 5 5 5 5 5 5 5 5 5 5 5 5 As a general framework, PROVoKing supports the in-memory creation of all the PROV constructs. It supports the serialisation and deserialisation of PROV documents containing these constructs, currently to/from RDF only. Bundles are serialised only as typed entities, not as named graphs or similar, as PROV-O does not dictate how to do this. Example code is provided producing and consuming the provenance data from PROV Primer, and a separate example illustrating the remaining concepts.
Olaf Hartig 4 4 4 4 4 2 2 2 4 4 2 4 2 2 2 2 2 2 2 4 2 2 2 2 2 2 2 2 2 2 2 2 2 2 The default provenance information generated by the Triplify metadata component does _not_ make use of PROV. Instead, this information is described using the Provenance Vocabulary. However, since the Provenance Vocabulary is a specialization of PROV-O, the PROV concepts selected above are supported indirectly by the metadata component of Triplify.
Paolo Missier 4 4 4 4 4 2 4 4 4 4 2 2 2 2 2 2 2 2 2 2 2 4 2 2 2 2 2 2 4 2 2 1 1 2 the seed PROV graph may only contain the relations stated above, and it will consequently generate (many more instances of) relations of the same type.
Attributes are used in the seed graph to specify how the generation should take place
Edoardo Pignotti 4 4 4 4 4 4 4 4 4 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion We use PROV-O to capture the provenance about the history of a simulation. Here, entities are assertions and retractions in the ontology of the model’s state (whether expressed as settings of objects’ instance variables or as facts in a fact-base in a declarative model). Activities are rules, actions, methods and simulation steps that change the state, and agents are software entities performing the activities within the model. Agent : Agents in the model are software entities that are responsible for a change in the model’s state. In OBIAMA, all actions must be associated with an agent.

Activity : There are two subclasses of activity. One subclass is an Action, e.g. model initialisation; the other is assertion or retraction, which describes the process of asserting or retracting OWL axioms in the model’s state.

Entity : OWL Axioms in the model state ontology.

wasGeneratedBy : Assertions generate OWL Axioms.

used : Retractions use the retracted Axiom.

wasAttributedTo : Axioms are attributed to the Agent that caused the change.

wasAssociatedWith : Agents are associated with the Actions that they perform.

wasInformedBy : Inferences between assertions and retractions based on the Action performed by an Agent.

wasDerivedFrom : Axioms were derived from other axioms if the Actions used them.
Jacco van Ossenbruggen 4 4 4 4 4 2 4 4 4 2 2 4 2 2 2 2 4 2 4 4 2 2 2 2 2 2 2 2 2 2 2 2 2 4
Olaf Hartig 4 4 4 4 4 2 2 2 4 4 2 4 2 2 2 2 2 2 2 4 2 2 2 2 2 2 2 2 2 2 2 2 2 2 The default provenance information generated by the D2R Server metadata component does _not_ make use of PROV. Instead, this information is described using the Provenance Vocabulary. However, since the Provenance Vocabulary is a specialization of PROV-O, the PROV concepts selected above are supported indirectly by the metadata component of D2R Server.
Trung Dong Huynh 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5
Trung Dong Huynh 4 4 4 4 4 2 4 2 4 2 4 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 4 2 4 4 An example: https://provenance.ecs.soton.ac.uk/store/bundles/55/
Reza B'Far 5 5 5 4 4 3 No opinion No opinion No opinion No opinion 5 5 2 5 No opinion 5 5 5 5 4 4 2 3 No opinion 4 No opinion No opinion No opinion 4 No opinion No opinion No opinion No opinion No opinion
Jun Zhao 4 4 4 4 4 2 2 2 4 4 2 4 2 2 2 2 2 2 2 4 2 2 2 2 2 2 2 2 2 2 2 2 2 2 The default provenance information generated by the Pubby metadata component does _not_ make use of PROV. Instead, this information is described using the Provenance Vocabulary. However, since the Provenance Vocabulary is a specialization of PROV-O, the PROV concepts selected above are supported indirectly by the metadata component of Pubby.
Satya Sahoo 3 3 3 3 3 No opinion No opinion No opinion No opinion No opinion 3 3 No opinion No opinion No opinion No opinion 3 No opinion 3 3 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion 3 3 No opinion No opinion SemPoD uses provenance information as contextual metadata for data integration and answering user queries based on provenance of the proteomics data. SemPoD is underpinned by the SysPro ontology, which extends PROV-O, to implement an intuitive query environment for use by proteomics researchers at Case CPB.
Mohamed Morsey 5 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion
Chris Baillie 5 5 5 5 5 No opinion 5 5 5 No opinion No opinion No opinion 5 No opinion No opinion No opinion No opinion No opinion No opinion 5 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion To perform quality assessment, an activity must examine the context around data. We argue that provenance should be an important part of this context as it can provide information about data sources, the method used in data creation, and how the data has transformed over time - including who had access to the data, who processed it, and how the data was previously assessed. Similarly, provenance has been identified as a key component in enabling data re-use and so capturing the provenance of previous quality assessment is essential if agents are to make decisions about quality score re-use.
Stephan Zednik 4 4 4 4 4 4 4 4 4 1 1 1 2 1 1 1 4 4 4 1 4 1 1 2 1 2 1 4 1 4 1 4 4 4
Landong Zuo 5 5 5 4 4 2 4 4 4 4 5 5 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion 5 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion
Tom De Nies 5 5 5 5 5 2 5 5 5 2 2 2 2 5 5 5 5 5 5 2 5 5 5 5 5 1 5 5 5 5 2 5 5 2 Since the focus of this research project is mainly around the detection of revisions and primary sources of news articles, only the features relating to these concepts were implemented. The rationale for read/write support instead of just read support is because the application both generates and consumes provenance. Provenance is generated from the content, and the original provenance, specified in PROV-N or PROV-JSON is consumed to serve as ground truth to measure the precision of the approach.
Rinke Hoekstra 4 4 4 2 4 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion 4 4 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion
Paolo Missier 3 3 3 3 3 2 3 1 3 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion 3 No opinion 3 No opinion No opinion No opinion No opinion No opinion 3 3 No opinion No opinion 3 No opinion No opinion The original aim of the project was to determine whether the Neo4J DB would be suitable to accommodate PROV instances, and if not, to identify the shortcomings (i.e., for N-ary relations, N>2).
The code reads in PROV-N and generates a graph into Neo4J.
For the time being, relations have been simplified to be binary.
Peter Slaughter 4 4 4 4 4 No opinion No opinion No opinion No opinion No opinion 4 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion 4 4 No opinion No opinion No opinion 4 No opinion The PROV features that we use are only those that are necessary to represent our data model in the PROV-XML serialization.
David Corsar 5 5 5 5 5 5 5 5 5 5 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 An API was required to create and manipulate provenance graphs using PROV core terms, which could be implemented using different technologies.
Edoardo Pignotti 5 5 5 5 5 No opinion 5 5 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion SPIN-SPARQL rules are used to infer the provenance of the policy reasoning process based on the context surrounding a policy activation and execution.
David Corsar 4 4 4 4 4 2 4 2 4 2 2 2 2 2 2 2 4 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
Peer Brauer 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 2 2 2 2 2 2 2 2 2 5 2 2 2 2 5
Trung Dong Huynh 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5 5
Timothy Lebo 4 4 4 4 4 4 4 4 4 No opinion 4 4 No opinion No opinion No opinion No opinion No opinion No opinion No opinion 4 No opinion No opinion 4 4 No opinion No opinion No opinion No opinion 4 No opinion 4 No opinion 4 4 Based on a SPARQL query of the triple store that csv2rdf4lod-automation populates, with the following term counts (an illustrative query sketch follows this table):

prov:Activity 85,182
prov:Association 77,249
prov:Plan 10
prov:agent 487
prov:alternateOf 6,633
prov:atLocation 7055
prov:endedAtTime 836
prov:generatedAtTime 8688
prov:hadPlan 83,628
prov:qualifiedAssociation 83,287
prov:specializationOf 28,443
prov:startedAtTime 7,664
prov:used 35,902
prov:value 314
prov:wasAssociatedWith 487
prov:wasAttributedTo 31,6605
prov:wasDerivedFrom 9,873
prov:wasGeneratedBy 84,238
prov:wasInformedBy 6,272
Sara Magliacane 4 4 4 4 4 No opinion 4 No opinion 4 No opinion No opinion No opinion No opinion 4 No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion 5 No opinion No opinion No opinion No opinion No opinion No opinion 5 No opinion No opinion 4 No opinion We use the above-mentioned features both as an output of our provenance reconstruction application and as a way to describe our gold standard.

The current set of features is limited by the currently implemented reconstruction algorithms, but we may want to expand it to consider also other features.
Timothy Lebo 4 4 4 4 4 2 4 4 4 2 4 2 2 2 2 2 2 4 4 4 4 4 4 4 2 2 2 4 4 4 2 2 4 2
Timothy Lebo 5 5 5 2 2 2 2 2 2 5 5 5 2 2 2 2 2 2 2 2 2 2 5 5 5 2 5 2 2 5 5 2 5 5
Ashley Smith No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion No opinion - because the implementation is currently under heavy development and features may be added or dropped.
Spyros Kotoulas 5 3 5 2 2 2 5 2 2 2 2 2 2 2 2 3 4 2 4 2 2 2 2 2 2 2 2 2 2 2 2 2 5 2
Chester Chen 5 5 5 5 5 5 1 5 5 5 5 1 5 1 1 5 5 5 5 1 2 1 2 1 5 5 5 5 5 5 5 5 5 5
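
Regarding the csv2rdf4lod-automation rationale above (Timothy Lebo), per-term tallies of that kind can be obtained with a SPARQL aggregation query over the populated triple store. The following is a hedged sketch using Python and SPARQLWrapper; the endpoint URL is an assumption for illustration and is not part of the survey response, and the query only counts PROV property usage (class instances would be counted via rdf:type separately).

from SPARQLWrapper import SPARQLWrapper, JSON

# Hypothetical SPARQL endpoint of the triple store being surveyed.
endpoint = SPARQLWrapper('http://localhost:8890/sparql')
endpoint.setQuery("""
    PREFIX prov: <http://www.w3.org/ns/prov#>
    SELECT ?p (COUNT(*) AS ?uses)
    WHERE { ?s ?p ?o . FILTER(STRSTARTS(STR(?p), STR(prov:))) }
    GROUP BY ?p
    ORDER BY DESC(?uses)
""")
endpoint.setReturnFormat(JSON)

# Print each PROV property together with the number of triples that use it.
for row in endpoint.query().convert()['results']['bindings']:
    print(row['p']['value'], row['uses']['value'])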

6. Does this implementation support any constraints or inferences defined by the PROV-CONSTRAINTS?

If you have an implementation of the constraints, we encourage you to test their compatibility with our test-cases and report your results. Please see the Constraints Test Cases document for more details.

Summary

ChoiceAll responders
Results
yes 2
no 39

Details

Responder | Does this implementation support any constraints or inferences defined by the PROV-CONSTRAINTS? | Comments
Danius Michaelides no
Michael Jewell no
Daniel Garijo no
Stian Soiland-Reyes no
Trung Dong Huynh no
Clément Caron no
Luc Moreau no A constraint validator is built on top of the toolbox but as a separate package.
Hook Hua no
Eric Stephan no
Amir Sezavar Keshavarz no
Simon Miles no
Olaf Hartig no
Paolo Missier no
Edoardo Pignotti no
Jacco van Ossenbruggen no
Olaf Hartig no
Trung Dong Huynh no
Trung Dong Huynh no
Reza B'Far yes
Jun Zhao no
Satya Sahoo no
Mohamed Morsey no
Chris Baillie no
Stephan Zednik no
Landong Zuo no
Tom De Nies no
Rinke Hoekstra no
Paolo Missier no
Peter Slaughter no
David Corsar no
Edoardo Pignotti no
David Corsar no
Peer Brauer yes We are currently implementing support for constraints and invariants.
Trung Dong Huynh no
Timothy Lebo no
Sara Magliacane no
Timothy Lebo no
Timothy Lebo no
Ashley Smith no
Spyros Kotoulas no
Chester Chen no But PROV-CONSTRAINTS will be part of the unit test suites.

7. Framework Usage

What tools or frameworks does your implementation use? How are you using them?

Details

Responder | Framework Usage
Danius Michaelides ProvJSON, RDFLib
Michael Jewell We use provpy within a Django application. This lets us export to PROV-N and PROV-JSON, and will support RDF serialization in future.
Daniel Garijo Jena
Stian Soiland-Reyes Alibaba (See https://github.com/wf4ever/taverna-prov/tree/master/prov-taverna-owl-bindings )
Trung Dong Huynh No
Clément Caron OpenRDF Sesame
Luc Moreau For XML: JAXB; for RDF: Sesame; for JSON: Gson; for PROV-N: ANTLR.
Hook Hua Jena, RDFReactor, Virtuoso, Joseki
Eric Stephan Protégé OWL API, AliBaba, Sesame.
Amir Sezavar Keshavarz ProvToolbox (https://github.com/lucmoreau/ProvToolbox)
Use of different projects in ProvToolbox namely: prov-dot, prov-interop, prov-json, and prov-xml.
Simon Miles Jena is currently required for deserialising PROV data.
Olaf Hartig none
Paolo Missier ProvToolbox/prov-n and its dependencies, for generation.
The git repo this implementation lives in is forked from https://github.com/lucmoreau/ProvToolbox
Please see the note above on prov-java; this has been added to the same forked git repo.
The PROV-N parser in the toolbox is used for interpreting the seed graph and for generating the expanded graph.
Edoardo Pignotti OWLAPI version 2
Jacco van Ossenbruggen We use the RDF libraries of SWI-Prolog
Olaf Hartig Jena (used for managing RDF data in D2R server; the metadata component uses it for generating RDF data that represents metadata statements and for serializing RDF data)
Trung Dong Huynh - PROV Python library (http://pypi.python.org/pypi/prov) provides the core provenance implementation
- Django (www.djangoproject.com): Web application framework
- Tastypie (https://github.com/toastdriven/django-tastypie): REST interface
Trung Dong Huynh The Prov python package
Reza B'Far Everything is built from scratch.
Jun Zhao Jena (used for querying SPARQL endpoints, for generating RDF data that represents metadata statements, and for serializing RDF data)
Satya Sahoo Ruby-on-Rails and XML-based OWL parser
Mohamed Morsey Jena, VAADIN, Weka, Azure Bing Search APIs, azure-bing-search-java framework, H2Database
Chris Baillie Jena, SPIN
Stephan Zednik RDFLib in scripts to generate RDF; Jena/ELDA are being looked at as potential frameworks for user tools to explore the NCA provenance (an RDFLib sketch follows this table).
Landong Zuo Jena 2.7.2 for data modelling and RDF serialisation.

NG4J for named graphs and digital signatures.
Tom De Nies - Named Entity Recognition tools such as OpenCalais, DBpedia Spotlight and AlchemyAPI are used to create feature vectors of the documents to derive provenance about.
- Hibernate is used to store documents and their provenance in a custom data model
- Since we only needed a smaller part of the spec for our use case, we opted to build our own internal Java framework to handle the provenance. In other words, we did not use the provenance toolkit.
Rinke Hoekstra Python RDFLib
Paolo Missier ProvToolbox:

<dependency>
  <groupId>org.openprovenance.prov</groupId>
  <artifactId>prov-n</artifactId>
  <version>0.0.1-SNAPSHOT</version>
</dependency>
Peter Slaughter
David Corsar Jena for the Jena implementation of the API.
Edoardo Pignotti SPIN-SPARQL, Jena
David Corsar Jena
Peer Brauer EMF (Java object model of PROV-O), JAX-WS, EMF Validation Framework (invariants & constraints), Xtext & Xtend (text-to-model transformation)
Trung Dong Huynh Django Object-relational Mapping is used for persisting provenance documents into databases
Timothy Lebo Sesame, raptor, serdi, Virtuoso, jena, saxon, javacsv2.0, apache commons, RDF::Trine,
Sara Magliacane We use the ProvToolbox (https://github.com/lucmoreau/ProvToolbox) to create the provenance graphs and display them as DOT files.
Timothy Lebo raptor, SuRF, sadi.py, csv2rdf4lod-automation, Virtuoso, rdflib, ckanclient, CKAN, Twisted
Timothy Lebo XSL
Ashley Smith Raptor/Rapper and Graphite for RDF parsing and serialisation.
Spyros Kotoulas Jena, DB2 RDF extensions, Jenabean (used for serialization of provenance information).
Chester Chen No, we are not currently using any toolkit. Our implementation differs greatly from those available in other domains, so we may have to build it from scratch.
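
Several respondents in this table (Danius Michaelides, Stephan Zednik, Rinke Hoekstra, Timothy Lebo) report using RDFLib to generate PROV-O RDF. As referenced above, here is a minimal, illustrative sketch of asserting a few PROV-O statements with RDFLib; the resource names are invented for the example and are not drawn from any survey response.

from rdflib import Graph, Namespace, RDF

PROV = Namespace('http://www.w3.org/ns/prov#')  # PROV-O namespace
EX = Namespace('http://example.org/')           # illustrative namespace

g = Graph()
g.bind('prov', PROV)
g.bind('ex', EX)

# A toy provenance assertion: a report generated by an activity, attributed to an agent.
g.add((EX.report, RDF.type, PROV.Entity))
g.add((EX.compileReport, RDF.type, PROV.Activity))
g.add((EX.alice, RDF.type, PROV.Agent))
g.add((EX.report, PROV.wasGeneratedBy, EX.compileReport))
g.add((EX.report, PROV.wasAttributedTo, EX.alice))
g.add((EX.compileReport, PROV.wasAssociatedWith, EX.alice))

print(g.serialize(format='turtle'))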

8. Provenance Exchange

Has this implementation been used to consume a PROV serialization generated by another tool? If so, please identify the other tool and describe how it was used.

Details

Responder | Provenance Exchange
Danius Michaelides
Michael Jewell
Daniel Garijo
Stian Soiland-Reyes No, export only.
Trung Dong Huynh No
Clément Caron
Luc Moreau To complete: the translator is used in the Soton infrastructure for various services.
Hook Hua
Eric Stephan
Amir Sezavar Keshavarz This framework is generic and can be instantiated in different domains. Other tools may provide a provenance graph in any supported representation such as XML, JSON, or RDF.
Also, the output of this framework was exported to a provenance validation system.
Simon Miles Yes, it has produced PROV-O data for all above PROV concepts, and this has been successfully consumed by PROV Toolkit.
Olaf Hartig
Paolo Missier no
Edoardo Pignotti N/A
Jacco van Ossenbruggen
Olaf Hartig
Trung Dong Huynh The web service can consume provenance documents in PROV-JSON (via its Web/REST interfaces)
It currently consumes provenance documents deposited by the PROV validator and translator by Luc Moreau.
Trung Dong Huynh No
Reza B'Far N/A
Jun Zhao No
Satya Sahoo No
Mohamed Morsey
Chris Baillie
Stephan Zednik No.
Landong Zuo Not yet.
Tom De Nies
Rinke Hoekstra No
Paolo Missier Not yet. This is now being used in a new project, but no interoperability testing is available as yet.
Peter Slaughter No.
David Corsar No
Edoardo Pignotti No
David Corsar No
Peer Brauer
Trung Dong Huynh No. But the library exports all PROV-N statements and all of them can be consumed by the Provenance Validator/Translator by Luc Moreau
Timothy Lebo No. Only tools which I've written myself.
Sara Magliacane No.
Timothy Lebo No; only implementation that I wrote.
Timothy Lebo No.
Ashley Smith Not currently.
Spyros Kotoulas No
Chester Chen Not at the moment, but our platform is built with API layers that take JSON as the serialization format. How to exchange with other provenance servers in other formats will be considered in a later phase.

More details on responses

  • Timothy Lebo: last responded on 7, February 2013 at 16:54 (UTC)
  • Sara Magliacane: last responded on 7, February 2013 at 17:24 (UTC)
  • Timothy Lebo: last responded on 13, February 2013 at 14:59 (UTC)
  • Timothy Lebo: last responded on 13, February 2013 at 15:00 (UTC)
  • Ashley Smith: last responded on 15, February 2013 at 15:36 (UTC)
  • Spyros Kotoulas: last responded on 19, February 2013 at 14:54 (UTC)
  • Chester Chen: last responded on 6, April 2013 at 21:19 (UTC)

Everybody has responded to this questionnaire.

