PRINCIPLE 2 — Timely and Comprehensive
Author: Alison Rygh | Reviewers: Ana Brandusescu, Danny Lämmerhirt
Last updated
Open data should be relevant and must include enough information about the data (metadata) so that users do not misinterpret or misuse it. This principle highlights three key concepts that are important for open data to be timely and comprehensive: planning, prioritizing, and consulting users. All three serve the common end of creating public value. A second value statement is that data should be released in its original, unmodified form, which is preferable to aggregated data.
The province of Ontario in Canada, a Charter adopter, provides a clear outline of its data collection, standards, and publishing processes in the Ontario Open Data Directive, along with examples of “high quality” datasets. The data is published in the province's catalogue. Every dataset includes a feedback mechanism that lets users contact the provider, records the date added and the update frequency, and has its metadata posted alongside it.
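To make that kind of catalogue record concrete, the sketch below models a dataset entry carrying the elements described above: date added, update frequency, a feedback contact, and accompanying metadata. It is an illustrative sketch only; the DatasetRecord class and its field names are assumptions, not Ontario's actual catalogue schema.

```python
# Illustrative sketch only: the class and field names are assumptions,
# not the schema of Ontario's data catalogue.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DatasetRecord:
    """A minimal open data catalogue entry."""
    title: str
    date_added: date                  # when the dataset was first published
    update_frequency_days: int        # how often the publisher commits to refresh it
    feedback_contact: str             # channel for users to contact the provider
    metadata: dict = field(default_factory=dict)  # definitions, guidance, links, etc.

record = DatasetRecord(
    title="School enrolment",
    date_added=date(2017, 3, 1),
    update_frequency_days=365,
    feedback_contact="opendata@example.gov",
    metadata={"definitions": "Enrolment counted on the first school day of October"},
)
```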
Target audience(s)
Governments
What is currently measured
Elements of Principle 2 that are assessed by leading open data measurement tools are catalogued in Appendix I - Principle 2 Indicator Table and reviewed below.
Commitment P2.a, “Create, maintain, and share public, comprehensive lists of data holdings to support meaningful consultations around data prioritization, publication, and release dates”, is measured by ODB, EODMA, and OURdata. Indicators assess the existence of a data inventory and whether or not this inventory is used to help governments prioritize dataset releases.
Commitment P2.b, “Release high-quality open data in a timely manner, without undue delay. Data will be comprehensive and accurate, and released in accordance with prioritization that is informed by consultations with open data users, including citizens, other governments, and civil society and private sector organizations”, is measured by ODB, ODIN, GODI, OURdata, and EODMA. Indicators assess the existence of standardised publication processes, whether government has data quality control processes in place, whether historical and up-to-date data records are available, whether data is easily findable, and whether government allows citizens to give feedback on the quality of data. For timeliness, indicators compare information about publication dates against update timeframes, which are sometimes pre-defined.
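As a rough illustration of how such a timeliness check can be computed, the sketch below compares a dataset's last publication date against a pre-defined update timeframe and flags the dataset when that timeframe has elapsed. The function and its parameters are assumptions for illustration, not the actual scoring rules of ODB, ODIN, GODI, OURdata, or EODMA.

```python
# Hypothetical timeliness check; not the scoring logic of any specific index.
from datetime import date, timedelta
from typing import Optional

def is_timely(last_published: date, update_timeframe_days: int,
              today: Optional[date] = None) -> bool:
    """Return True if the dataset was refreshed within its expected timeframe."""
    today = today or date.today()
    return today - last_published <= timedelta(days=update_timeframe_days)

# Example: a dataset expected quarterly, last published 200 days ago -> not timely.
print(is_timely(date.today() - timedelta(days=200), update_timeframe_days=90))  # False
```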
Commitment P2.c, “To the extent possible, release data in its original, unmodified form, and link data to any relevant guidance, documentation, visualizations, or analyses”, is measured by ODB, EODMA, and OURdata. Indicators assess whether a link is provided that takes the user directly to the data source, as well as whether additional data about the dataset itself, i.e. metadata, is posted. Useful pieces of metadata for this commitment are definitions, data visualisations, and guidelines on the use of the data.
Commitment P2.d, “To the extent possible, release data that is disaggregated to the lowest levels of administration, including disaggregation by gender, age, income, and other categories”, is partially measured by ODB, ODIN, GODI, and OURdata. Indicators test the availability of specific disaggregation categories (e.g. disaggregation by sex for demographic data or by administrative levels). An alternative indicator tests the existence of guidelines requiring government to publish disaggregated datasets.
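One simple way to operationalise such an indicator is to check whether a published dataset actually carries the disaggregation dimensions the commitment asks for. The sketch below does this over a dataset's column names; the dimension and column names are illustrative assumptions, not the categories used by any particular index.

```python
# Hypothetical disaggregation check over a dataset's column names.
REQUIRED_DIMENSIONS = {"sex", "age_group", "income_band", "admin_level"}

def missing_disaggregations(columns: list[str]) -> set[str]:
    """Return the required disaggregation dimensions absent from the dataset."""
    return REQUIRED_DIMENSIONS - {c.lower() for c in columns}

# Example: a demographic dataset broken down by sex and age group only.
print(missing_disaggregations(["region", "sex", "age_group", "population"]))
# -> {'income_band', 'admin_level'} (set order may vary)
```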
Commitment P2.e, “Allow users to provide feedback, and continue to make revisions to ensure data quality is improved as necessary”, is measured by ODB, OURdata, and EODMA. Indicators assess two aspects: the existence of feedback mechanisms, and how governments respond to posted feedback, measured by the time it takes them to react and by user satisfaction with the response.
Commitment P2.f, “Apply consistent information lifecycle management practices, and ensure historical copies of datasets are preserved, archived, and kept accessible as long as they retain value”, is measured by ODB and OURdata. Indicators assess the existence of lifecycle management practices published by government and used to maintain historical copies, as well as the retention periods for the different datasets.
Commitment P2.g, “Consult data users on significant changes to the structure or supply of data in order to minimize the impact to users that have created tools based on open data”, is measured by OURdata, ODB, and EODMA. Indicators assess whether government publishes the dates of planned changes to existing datasets. The commitment is also measured by the number of consultations governments hold with users.
Commitment P2.h, “Be transparent about our own data collection, standards, and publishing processes by documenting these processes online”, is measured by ODB. Indicators assess the existence of published material about the government's processes across the data lifecycle.
All commitments of Principle 2 are measured, but often only partially. Prominent indicators include whether or not governments have a list of government-held data to share with the public, allowing co-development of a priority list of high-quality datasets in demand. Indicators assessing published data cover data formats, the metadata posted with the data (the more the better), several, but by no means all, aspects of data quality (e.g. timely publication, disaggregation of data, existence of data quality audits) and, once again, feedback mechanisms for users to contact providers about the data.