Check Dataset Metadata Quality
ESS-DIVE’s Assessment Reports provide automated checks of dataset metadata quality based on the FAIR (Findable, Accessible, Interoperable, Reusable) data principles. Dataset creators should use these reports to assess their metadata before requesting publication; datasets must pass all required checks to meet ESS-DIVE’s publication requirements (Figure 1).

Assessment Report Automated Review
Dataset metadata undergo an automated review whenever changes are submitted to ESS-DIVE. The automated review runs a standard suite of checks designed around the FAIR data principles (Table 1). The Metadata Assessment Report displays each check result and compiles a score for each FAIR category (e.g., 93/100/100/50).
ESS-DIVE reviewers use the Assessment Report scores during the formal review process to assess the quality of metadata before publication. Not all ESS-DIVE dataset publication requirements are contained within the Assessment Report. For a complete list of both manual and automated metadata checks, see the Dataset Requirements page.
Before publication, the report is only available to the dataset creator and those who have shared access to the private dataset. The Assessment Report becomes available to the public once the dataset is published.
Check Score while Drafting Metadata
Before requesting publication for a dataset, data contributors should review the Assessment Report and address any failed or warning checks. The report provides data contributors immediate feedback on their metadata quality and instructions to revise specific metadata fields. By improving the FAIR category scores before starting a formal review, data contributors can expedite the overall publication process.
Please note that assessment reports can take anywhere from a few minutes to 24 hours to generate.
To access an assessment report, navigate to a dataset landing page on ESS-DIVE and select the "Assessment Report" button on the right-hand side of the landing page (Figure 2).

Resolve Failed Checks and Warnings
Failed and warning checks will lower Assessment Report scores. Each check states its requirements and the steps needed to address the issue; use the provided instructions to revise your metadata.
Failed checks indicate that a required field does not follow the automated check criteria (Figure 3).

Warning checks indicate that an optional field does not follow the automated check criteria (Figure 4).

After revising and resubmitting your dataset, check your score again to ensure you’ve addressed all issues. If you see any message other than the notice that the score is loading (Figure 5), please contact ESS-DIVE support.
Automated Checks
The checks below are run on each dataset upon submission as part of the ESS-DIVE automated check suite. Review the Dataset Requirements page for more detailed descriptions, formatting requirements, and examples for the automated checks.
Informational checks do not impact the Assessment Report score.
| Check | Requirement Level | FAIR Category |
| --- | --- | --- |
| Title length is between 7 and 40 words | Required | Findable |
| Abstract length is at least 100 words | Required | Findable |
| Publication date is present | Required | Findable |
| At least one creator is present | Required | Findable |
| Dataset contact is present | Required | Accessible |
| Dataset contact ORCiD is provided | Required | Accessible |
| Start and end dates are present | Required | Findable |
| Project name is from the controlled list | Optional | Findable |
| Funding organization "U.S. DOE > Office of Science > Biological and Environmental Research (BER)" is present | Optional | Findable |
| Geographic description is present | Optional | Findable |
| Coordinates describing the point location or geographic area of the dataset are present | Optional | Findable |
| Methods description is more than 7 words in length | Required | Reusable |
| Data file formats are non-proprietary | Optional | Reusable |
| Usage rights are set to the Creative Commons CC-BY license | Optional | Reusable |
| Number of contacts with email addresses provided | Informational | Findable |
| Number of creators with email addresses provided | Informational | Findable |
| Number of data entities present | Informational | Interoperable |

Table 1: List of automated checks performed by the automated assessment suite
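Several of the required checks above are simple word-count and presence tests, so contributors can screen their metadata locally before submitting. The sketch below is an illustrative, unofficial pre-check assuming a plain dictionary of metadata fields; the field names (`title`, `abstract`, `methods`, `creators`) are assumptions, not ESS-DIVE's actual schema.

```python
# Hypothetical local pre-check for a few of the required automated checks
# in Table 1. Illustrative only -- not an official ESS-DIVE tool, and the
# metadata field names are assumptions.

def word_count(text):
    """Count whitespace-separated words in a string."""
    return len(text.split())

def precheck(metadata):
    """Return a list of (check name, passed) tuples for four required checks."""
    return [
        ("Title length is between 7 and 40 words",
         7 <= word_count(metadata.get("title", "")) <= 40),
        ("Abstract length is at least 100 words",
         word_count(metadata.get("abstract", "")) >= 100),
        ("Methods description is more than 7 words in length",
         word_count(metadata.get("methods", "")) > 7),
        ("At least one creator is present",
         len(metadata.get("creators", [])) > 0),
    ]

if __name__ == "__main__":
    example = {
        "title": "Soil respiration measurements from an experimental warming plot, 2020-2022",
        "abstract": "word " * 120,  # stand-in for a 100+ word abstract
        "methods": "Fluxes were measured weekly with a closed-chamber system.",
        "creators": ["Jane Doe"],
    }
    for name, passed in precheck(example):
        print(("PASS" if passed else "FAIL"), "-", name)
```

Running such a screen before submission mirrors the report's failed/warning feedback loop and can reduce the number of revise-and-resubmit cycles.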