Check Dataset Metadata Quality

Assessment Reports provide automated dataset metadata quality checks based on FAIR (Findable, Accessible, Interoperable, Reusable) data principles.
Dataset creators should use Assessment Reports to evaluate their dataset metadata quality during submission. For your dataset to be approved for publication, you must pass all required checks and resolve any warnings in the assessment report (Figure 1).
Figure 1. Assessment Report Score and Checks
Assessment Reports compile the outcomes of the suite of automated checks performed whenever a dataset is submitted on ESS-DIVE, and become available once a dataset is submitted. While the dataset is private, the report is available only to the dataset creator and anyone with shared access to the dataset. Once the dataset is published, the report becomes available to all ESS-DIVE users.
ESS-DIVE reviewers also use assessment reports to evaluate the quality of a dataset before publication. The report does not contain every dataset metadata check, but it is one form of feedback you will receive during the dataset review process. For a complete list of dataset metadata checks and requirements, see the Dataset Requirements page.
To access an assessment report, navigate to a dataset landing page on ESS-DIVE and select the "Assessment report" button on the right-hand side of the landing page (Figure 2).
Figure 2. Assessment Report Button on Dataset Page

Review Assessment Reports when Submitting Data

Before requesting publication for a dataset, review the assessment report to address any failed automated checks or warnings. The report score (e.g., 93/100/100/50) provides automated feedback on your dataset quality based on FAIR data principles. Addressing failed checks and warnings before requesting publication helps expedite the review and publication process.
Please note that assessment reports can take a few minutes, or up to 24 hours, to generate.
Failed checks indicate that a required field does not meet the automated check criteria (Figure 3). Each failed check includes details on the check's requirements and steps to resolve the issue. For example, if a dataset does not include an ORCiD for the dataset contact, the failed check will notify you that no ORCiD is present and one should be provided. Similarly, if there is no methods section, the failed check will alert you to add a methods section to your dataset.
Figure 3. Failed Automated Checks
Warnings indicate that an optional field does not meet the automated check criteria (Figure 4). For example, if you do not provide a textual description of the geographic coverage, the geographic region check will prompt you to add a geographic region description.
Figure 4. Warning for Automated Check
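The distinction above is simply whether the field being checked is required or optional. A minimal sketch of that classification logic, with hypothetical names that are illustrative only and not ESS-DIVE's actual implementation:

```python
# Hypothetical sketch: a missing required field produces a failed check,
# while the same gap in an optional field produces only a warning.
def run_check(metadata, field, required, passes):
    """Return 'pass', 'fail' (required field), or 'warning' (optional field)."""
    if passes(metadata.get(field)):
        return "pass"
    return "fail" if required else "warning"

dataset = {"contact_orcid": None, "geographic_description": None}

# Required check: missing contact ORCiD fails outright.
print(run_check(dataset, "contact_orcid", required=True,
                passes=lambda v: bool(v)))          # fail
# Optional check: missing geographic description only warns.
print(run_check(dataset, "geographic_description", required=False,
                passes=lambda v: bool(v)))          # warning
```

Either outcome is reported with instructions for the specific field, but only failed checks block publication.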

Resolving Failed Checks and Warnings

Use the instructions provided for each check to resolve any failed checks and warnings. The assessment report is regenerated every time a dataset is submitted, so after making revisions you can re-review the report to ensure you have addressed all issues.
Note that the assessment report can take some time to regenerate after a dataset is submitted. If you see any error other than the assessment report loading page (Figure 5), please contact ESS-DIVE support.
Figure 5. Assessment Report Loading Page

Assessment Report Checks

The checks below are run on each dataset upon submission as part of the ESS-DIVE automated check suite. Informational checks appear in their own section of the assessment report and are not pass/fail.
Review the Dataset Requirements Page for more detailed descriptions, formatting requirements, and examples for the automated checks.
| Criteria | Required/Optional | FAIR Category |
| --- | --- | --- |
| Title length between 7 and 40 words | Required | Findable |
| Abstract length is at least 100 words | Required | Findable |
| Keywords vary from title and at least 3 are present | Required | Findable |
| Publication date is present | Required | Findable |
| At least one creator is present | Required | Findable |
| Dataset contact is present and an ORCiD is provided | Required | Findable |
| URLs in metadata resolve correctly | Required | Findable |
| Start and end dates are present | Required | Findable |
| Project name is from controlled list | Optional | Findable |
| Funding organization "U.S. DOE > Office of Science > Biological and Environmental Research (BER)" is present | Optional | Findable |
| Geographic description is present | Optional | Findable |
| Coordinates describing the point location or geographic area of the dataset are present | Optional | Findable |
| Metadata identifier is resolvable | Optional | Accessible |
| Methods description is more than 7 words in length | Required | Interoperable |
| Data file formats are non-proprietary | Optional | Reusable |
| Usage rights is set to Creative Commons CC-BY license | Optional | Reusable |
| Number of contacts with email addresses provided | Informational | Findable |
| Number of creators with email addresses provided | Informational | Findable |
| Count of data entities present | Informational | Interoperable |
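Several of the checks above are simple word-count or presence tests, so you can screen your own metadata before submission. The sketch below applies three of them (title length, abstract length, keyword count) using the thresholds from the table; the function names and the word-splitting approach are assumptions for illustration, not ESS-DIVE's actual check suite.

```python
# Illustrative pre-submission screening for three checks from the table above.
# Thresholds come from the table; implementation details are assumptions.

def check_title(title: str) -> bool:
    """Title length between 7 and 40 words."""
    n = len(title.split())
    return 7 <= n <= 40

def check_abstract(abstract: str) -> bool:
    """Abstract length is at least 100 words."""
    return len(abstract.split()) >= 100

def check_keywords(title: str, keywords: list[str]) -> bool:
    """At least 3 keywords that are not already words in the title."""
    title_words = {w.lower() for w in title.split()}
    distinct = [k for k in keywords if k.lower() not in title_words]
    return len(distinct) >= 3

# Example: a 4-word title is too short, so the title check fails.
print(check_title("Soil Carbon Flux Measurements"))                      # False
print(check_keywords("Soil Carbon Flux Measurements",
                     ["temperature", "moisture", "respiration"]))        # True
```

Passing a local screen like this does not replace the assessment report, which also runs checks (such as URL resolution) that cannot be reproduced from word counts alone.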