Code Examples
Setup
The following code examples require installing certain packages and authenticating with your ESS-DIVE account. Follow the instructions for setting up the Dataset API in your preferred coding language before trying out the code examples below:
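The Python snippets in this guide reference a handful of variables created during setup (base, endpoint, and header_authorization). As a reminder, a minimal sketch of that setup; the sandbox URL, "packages" endpoint, and token format are assumptions based on the setup guide, and the token itself is a placeholder you must replace:

```python
# Sketch of the setup variables assumed by the examples below; the sandbox
# URL and "packages" endpoint are assumptions from the setup guide, and the
# token is a placeholder to replace with one from your ESS-DIVE account.
base = "https://api-sandbox.ess-dive.lbl.gov/"
endpoint = "packages"
token = "<your-authentication-token>"
header_authorization = "bearer {}".format(token)
```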
Setup and Troubleshoot
Create Metadata
The metadata example provided here is from the ESS-DIVE sandbox site: https://data-sandbox.ess-dive.lbl.gov/#view/doi:10.3334/CDIAC/spruce.001.
Format JSON-LD Metadata in Python
Set up the JSON for the "provider", which includes details about the project. Update the "value" field with the desired project identifier; you can look up project identifiers in ESS-DIVE's project list: https://data.ess-dive.lbl.gov/projects. The project will be listed as the publisher in the citation.
provider_spruce = {
"identifier": {
"@type": "PropertyValue",
"propertyID": "ess-dive",
"value": "1e6d50d3-9532-43fb-a63f-bdcb4350bf0c"
}
}
Prepare the dataset authors in the order that you would like them to appear in the citation. Please add the ORCID for all authors, especially the first author, if possible.
creators = [
{
"@id": "http://orcid.org/0000-0001-7293-3561",
"givenName": "Paul J",
"familyName": "Hanson",
"affiliation": "Oak Ridge National Laboratory",
"email": "[email protected]"
},
{
"givenName": "Jeffrey",
"familyName": "Riggs",
"affiliation": "Oak Ridge National Laboratory"
},
{
"givenName": "C",
"familyName": "Nettles",
"affiliation": "Oak Ridge National Laboratory"
},
{
"givenName": "William",
"familyName": "Dorrance",
"affiliation": "Oak Ridge National Laboratory"
},
{
"givenName": "Les",
"familyName": "Hook",
"affiliation": "Oak Ridge National Laboratory"
}
]
Create the rest of the JSON-LD object
json_ld = {
"@context": "http://schema.org/",
"@type": "Dataset",
"@id": "http://dx.doi.org/10.3334/CDIAC/spruce.001",
"name": "SPRUCE S1 Bog Environmental Monitoring Data: 2010-2016",
"description": [
"This data set reports selected ambient environmental monitoring data from the S1 bog in Minnesota for the period June 2010 through December 2016. Measurements of the environmental conditions at these stations will serve as a pre-treatment baseline for experimental treatments and provide driver data for future modeling activities.",
"The site is the S1 bog, a Picea mariana [black spruce] - Sphagnum spp. bog forest in northern Minnesota, 40 km north of Grand Rapids, in the USDA Forest Service Marcell Experimental Forest (MEF). There are/were three monitoring sites located in the bog: Stations 1 and 2 are co-located at the southern end of the bog and Station 3 is located north central and adjacent to an existing U.S. Forest Service monitoring well.",
"There are eight data files with selected results of ambient environmental monitoring in the S1 bog for the period June 2010 through December 2016. One file has the ",
"other seven have the available data for a given calendar year. Not all measurements started in June 2010 and EM3 measurements ended in May 2014.",
"Further details about the data package are in the attached pdf file (SPRUCE_EM_DATA_2010_2016_20170620)."
],
"creator": creators,
"datePublished": "2015",
"keywords": [
"EARTH SCIENCE > BIOSPHERE > VEGETATION",
"Climate Change"
],
"variableMeasured": [
"EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC TEMPERATURE > SURFACE TEMPERATURE > AIR TEMPERATURE",
"EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC WATER VAPOR > WATER VAPOR INDICATORS > HUMIDITY > RELATIVE HUMIDITY",
"EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC PRESSURE > SEA LEVEL PRESSURE",
"EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC TEMPERATURE > SURFACE TEMPERATURE > DEW POINT TEMPERATURE > DEWPOINT DEPRESSION",
"EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC WINDS > SURFACE WINDS > WIND SPEED",
"EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC WINDS > SURFACE WINDS > WIND DIRECTION",
"EARTH SCIENCE > BIOSPHERE > VEGETATION > PHOTOSYNTHETICALLY ACTIVE RADIATION",
"EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC RADIATION > NET RADIATION",
"EARTH SCIENCE > LAND SURFACE > SURFACE RADIATIVE PROPERTIES > ALBEDO",
"EARTH SCIENCE > LAND SURFACE > SOILS > SOIL TEMPERATURE",
"Precipitation (Total)",
"Irradiance",
"Groundwater Temperature",
"Groundwater Level",
"Volumetric Water Content",
"surface_albedo"
],
"license": "http://creativecommons.org/licenses/by/4.0/",
"spatialCoverage": [
{
"description": "Site ID: S1 Bog Site name: S1 Bog, Marcell Experimental Forest Description: The site is the 8.1-ha S1 bog, a Picea mariana [black spruce] - Sphagnum spp. ombrotrophic bog forest in northern Minnesota, 40 km north of Grand Rapids, in the USDA Forest Service Marcell Experimental Forest (MEF). The S1 bog was harvested in successive strip cuts in 1969 and 1974 and the cut areas were allowed to naturally regenerate. Stations 1 and 2 are located in a 1974 strip that is characterized by a medium density of 3-5 meter black spruce and larch trees with an open canopy. The area was suitable for siting a monitoring station for representative meteorological conditions on the S1 bog. Station 3 is located in a 1969 harvest strip that is characterized by a higher density of 3-5 meter black spruce and larch trees with a generally closed canopy. Measurements at this station represent conditions in the surrounding stand. Site Photographs are in the attached document",
"geo": [
{
"name": "Northwest",
"latitude": 47.50285,
"longitude": -93.48283
},
{
"name": "Southeast",
"latitude": 47.50285,
"longitude": -93.48283
}
]
}
],
"funder": {
"@id": "http://dx.doi.org/10.13039/100006206",
"name": "U.S. DOE > Office of Science > Biological and Environmental Research (BER)"
},
"temporalCoverage": {
"startDate": "2010-07-16",
"endDate": "2016-12-31"
},
"editor": {
"@id": "http://orcid.org/0000-0001-7293-3561",
"givenName": "Paul J",
"familyName": "Hanson",
"email": "[email protected]"
},
"provider": provider_spruce,
"measurementTechnique": [
"The stations are equipped with standard sensors for measuring meteorological parameters, solar radiation, soil temperature and moisture, and groundwater temperature and elevation. Note that some sensor locations are relative to nearby vegetation and bog microtopographic features (i.e., hollows and hummocks). See Table 1 in the attached pdf (SPRUCE_EM_DATA_2010_2016_20170620) for a list of measurements and further details. Sensors and data loggers were initially installed and became operational in June, July, and August of 2010. Additional sensors were added in September 2011. Station 3 was removed from service on May 12, 2014.",
"These data are considered at Quality Level 1. Level 1 indicates an internally consistent data product that has been subjected to quality checks and data management procedures. Established calibration procedures were followed."
]
}
Please refer to the API documentation to understand the schema and resolve any errors: https://api-sandbox.ess-dive.lbl.gov
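Before submitting, a quick local check can catch missing top-level fields. A sketch, where the key list simply mirrors the example above and is not an authoritative list of what the API requires:

```python
# Top-level keys mirrored from the example JSON-LD above; this is not an
# official required-field list from the Dataset API.
expected_keys = ["@context", "@type", "name", "description", "creator",
                 "datePublished", "keywords", "variableMeasured", "license",
                 "spatialCoverage", "funder", "temporalCoverage", "provider"]

def missing_keys(metadata):
    """Return the expected top-level keys absent from a metadata dict."""
    return [key for key in expected_keys if key not in metadata]

print(missing_keys({"@type": "Dataset", "name": "SPRUCE S1 Bog"}))
```

Running missing_keys(json_ld) on the object built above should return an empty list.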
Format JSON-LD Metadata in R
Because R's support for complex JSON-LD is limited, create a text file containing your JSON-LD and pass its path to the read_file function below.
Here is an example for a JSON-LD file located in our ESS-DIVE package service examples GitHub repository.
To make sure your file is saved as valid JSON-LD, consider using a text editor such as Atom (https://atom.io).
Download the file and load it into your script.
json_file <- read_file("~/directory/to/your/jsonld/file")
Format JSON-LD Metadata in Java
Set up the JSON definitions to build your JSON-LD.
//JSON objects variables
JSONObject provider_spruce_json = new JSONObject();
JSONObject member = new JSONObject();
JSONObject funder = new JSONObject();
JSONObject temporalCoverage = new JSONObject();
JSONObject editor = new JSONObject();
JSONObject spatial_coverage_json = new JSONObject();
JSONObject primary_Creator = new JSONObject();
JSONObject secondary_Creator = new JSONObject();
JSONObject geo_northwest = new JSONObject();
JSONObject geo_southeast = new JSONObject();
JSONObject JSON_LD = new JSONObject();
JSONArray creators_json = new JSONArray();
JSONArray spatial_coverage_array = new JSONArray();
JSONArray geo = new JSONArray();
JSONArray measurementTechnique = new JSONArray();
JSONArray JSON_LD_Description = new JSONArray();
JSONArray keywords = new JSONArray();
JSONArray variableMeasured = new JSONArray();
Now fill in the details about the "provider", which describes the project. The project will be listed as the publisher in the citation.
// JSON_LD member assignment
member.put("@id","http://orcid.org/0000-0001-7293-3561");
member.put("givenName","Paul J");
member.put("familyName","Hanson");
member.put("email","[email protected]");
member.put("jobTitle","Principal Investigator");
// JSON_LD provider spruce assignment
provider_spruce_json.put("name","SPRUCE");
provider_spruce_json.put("member",member);
Prepare the dataset authors in the order that you would like them to appear in the citation. Please add the ORCID for all authors, especially the first author, if possible.
// JSON_LD primary creator assignment
primary_Creator.put("@id","http://orcid.org/0000-0001-7293-3561");
primary_Creator.put("givenName","Paul J");
primary_Creator.put("familyName","Hanson");
primary_Creator.put("affiliation","Oak Ridge National Laboratory");
primary_Creator.put("email","[email protected]");
// JSON_LD secondary creator assignment
secondary_Creator.put("givenName","Jeffrey");
secondary_Creator.put("familyName","Riggs");
secondary_Creator.put("affiliation","Oak Ridge National Laboratory");
// Define as many creators as you need into newer JSON Objects and add them to the creators_json_array
creators_json.add(primary_Creator);
creators_json.add(secondary_Creator);
Initialize JSON_LD strings
// JSON_LD Strings arrays
variableMeasured.add("EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC TEMPERATURE > SURFACE TEMPERATURE > AIR TEMPERATURE");
variableMeasured.add("EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC WATER VAPOR > WATER VAPOR INDICATORS > HUMIDITY > RELATIVE HUMIDITY");
variableMeasured.add("EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC PRESSURE > SEA LEVEL PRESSURE");
variableMeasured.add("EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC TEMPERATURE > SURFACE TEMPERATURE > DEW POINT TEMPERATURE > DEWPOINT DEPRESSION");
variableMeasured.add("EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC WINDS > SURFACE WINDS > WIND SPEED");
variableMeasured.add("EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC WINDS > SURFACE WINDS > WIND DIRECTION");
variableMeasured.add("EARTH SCIENCE > BIOSPHERE > VEGETATION > PHOTOSYNTHETICALLY ACTIVE RADIATION");
variableMeasured.add("EARTH SCIENCE > ATMOSPHERE > ATMOSPHERIC RADIATION > NET RADIATION");
variableMeasured.add("EARTH SCIENCE > LAND SURFACE > SURFACE RADIATIVE PROPERTIES > ALBEDO");
variableMeasured.add("EARTH SCIENCE > LAND SURFACE > SOILS > SOIL TEMPERATURE");
variableMeasured.add("Precipitation (Total)");
variableMeasured.add("Irradiance");
variableMeasured.add("Groundwater Temperature");
variableMeasured.add("Groundwater Level");
variableMeasured.add("Volumetric Water Content");
variableMeasured.add("surface_albedo");
measurementTechnique.add("The stations are equipped with standard sensors for measuring meteorological parameters, solar radiation, soil temperature and moisture, and groundwater temperature and elevation. Note that some sensor locations are relative to nearby vegetation and bog microtopographic features (i.e., hollows and hummocks). See Table 1 in the attached pdf (SPRUCE_EM_DATA_2010_2016_20170620) for a list of measurements and further details. Sensors and data loggers were initially installed and became operational in June, July, and August of 2010. Additional sensors were added in September 2011. Station 3 was removed from service on May 12, 2014.");
measurementTechnique.add("These data are considered at Quality Level 1. Level 1 indicates an internally consistent data product that has been subjected to quality checks and data management procedures. Established calibration procedures were followed.");
JSON_LD_Description.add("This data set reports selected ambient environmental monitoring data from the S1 bog in Minnesota for the period June 2010 through December 2016. Measurements of the environmental conditions at these stations will serve as a pre-treatment baseline for experimental treatments and provide driver data for future modeling activities.");
JSON_LD_Description.add("The site is the S1 bog, a Picea mariana [black spruce] - Sphagnum spp. bog forest in northern Minnesota, 40 km north of Grand Rapids, in the USDA Forest Service Marcell Experimental Forest (MEF). There are/were three monitoring sites located in the bog: Stations 1 and 2 are co-located at the southern end of the bog and Station 3 is located north central and adjacent to an existing U.S. Forest Service monitoring well.");
JSON_LD_Description.add("There are eight data files with selected results of ambient environmental monitoring in the S1 bog for the period June 2010 through December 2016. One file has the ");
JSON_LD_Description.add("other seven have the available data for a given calendar year. Not all measurements started in June 2010 and EM3 measurements ended in May 2014.");
JSON_LD_Description.add("Further details about the data package are in the attached pdf file (SPRUCE_EM_DATA_2010_2016_20170620).");
keywords.add("EARTH SCIENCE > BIOSPHERE > VEGETATION");
keywords.add("Climate Change");
Add the nested information into JSON objects.
// JSON_LD spatial coverage assignment
spatial_coverage_json.put("description","Site ID: S1 Bog Site name: S1 Bog, Marcell Experimental Forest Description: The site is the 8.1-ha S1 bog, a Picea mariana [black spruce] - Sphagnum spp. ombrotrophic bog forest in northern Minnesota, 40 km north of Grand Rapids, in the USDA Forest Service Marcell Experimental Forest (MEF). The S1 bog was harvested in successive strip cuts in 1969 and 1974 and the cut areas were allowed to naturally regenerate. Stations 1 and 2 are located in a 1974 strip that is characterized by a medium density of 3-5 meter black spruce and larch trees with an open canopy. The area was suitable for siting a monitoring station for representative meteorological conditions on the S1 bog. Station 3 is located in a 1969 harvest strip that is characterized by a higher density of 3-5 meter black spruce and larch trees with a generally closed canopy. Measurements at this station represent conditions in the surrounding stand. Site Photographs are in the attached document");
spatial_coverage_json.put("geo", geo);
spatial_coverage_array.add(spatial_coverage_json);
// JSON_LD funder assignment
funder.put("@id", "http://dx.doi.org/10.13039/100006206");
funder.put("name", "U.S. DOE > Office of Science > Biological and Environmental Research (BER)");
// JSON_LD temporalCoverage assignment
temporalCoverage.put("startDate","2010-07-16");
temporalCoverage.put("endDate","2016-12-31");
// JSON_LD editor assignment
editor.put("@id", "http://orcid.org/0000-0001-7293-3561");
editor.put("givenName", "Paul J");
editor.put("familyName", "Hanson");
editor.put("email", "[email protected]");
// JSON_LD geo variables assignments
geo_northwest.put("name","Northwest");
geo_northwest.put("latitude",47.50285);
geo_northwest.put("longitude",-93.48283);
geo_southeast.put("name","Southeast");
geo_southeast.put("latitude",47.50285);
geo_southeast.put("longitude",-93.48283);
geo.add(geo_northwest);
geo.add(geo_southeast);
Create the rest of the JSON-LD object
// Main JSON_LD
JSON_LD.put("@context","http://schema.org/");
JSON_LD.put("@type","Dataset");
JSON_LD.put("@id","http://dx.doi.org/10.3334/CDIAC/spruce.001");
JSON_LD.put("name","SPRUCE S1 Bog Environmental Monitoring Data: 2010-2016");
JSON_LD.put("description",JSON_LD_Description);
JSON_LD.put("creator",creators_json);
JSON_LD.put("datePublished","2015");
JSON_LD.put("keywords",keywords);
JSON_LD.put("variableMeasured",variableMeasured);
JSON_LD.put("license","http://creativecommons.org/licenses/by/4.0/");
JSON_LD.put("spatialCoverage",spatial_coverage_array);
JSON_LD.put("funder",funder);
JSON_LD.put("temporalCoverage",temporalCoverage);
JSON_LD.put("editor",editor);
JSON_LD.put("provider", provider_spruce_json);
JSON_LD.put("measurementTechnique",measurementTechnique);
Submit Dataset
The following lines of code submit and validate JSON-LD metadata for a single dataset.
Metadata Only
Submit Metadata Only in Python
Submit the JSON-LD object with the Dataset API
post_packages_url = "{}{}".format(base,endpoint)
post_package_response = requests.post(post_packages_url,
                                      headers={"Authorization": header_authorization},
                                      json=json_ld)
if post_package_response.status_code == 201:
    # Success
    response = post_package_response.json()
    print(f"View URL:{response['viewUrl']}")
    print(f"Name:{response['dataset']['name']}")
else:
    # There was an error
    print(post_package_response.text)
Submit Metadata Only in R
Submit the JSON-LD object with the Dataset API
Make sure to enter your own file path, e.g. C:\\Users\\Files\\API_Tutorial.json on Windows or /Users/Files/API_Tutorial.json on Mac.
call_post_package <- paste(base,endpoint, sep="/")
post_package = POST(call_post_package,
body = json_file,
add_headers(Authorization=header_authorization,
"Content-Type"="application/json"))
Review the results
results = content(post_package)
attributes(results)
$names
[1] "id" "viewUrl" "detail" "errors" "dataset"
results$detail
[1] "Data Package created successfully."
results$errors
NULL
results$viewUrl
[1] "https://data-dev.ess-dive.lbl.gov/view/ess-dive-XXXXXXXXXXXX-20190621T175431086775"
Submit Metadata Only in Java
Submit the JSON-LD object with the Dataset API
try{
String url = base + endpoint;
HttpPost request = new HttpPost(url);
StringEntity params = new StringEntity(JSON_LD.toString()); //Setting the JSON-LD Object to the request params
request.addHeader("content-type", "application/json");
request.addHeader("Authorization", header_authorization);
request.setEntity(params);
HttpResponse response;
response = httpClient.execute(request);
HttpEntity entity = response.getEntity();
String responseString = EntityUtils.toString(entity, "UTF-8");
if(response.getStatusLine().getStatusCode() == 201){
System.out.println(response.toString());
System.out.println(responseString);
} else {
System.out.println(response.getStatusLine().getReasonPhrase());
System.out.println(responseString);
}
} catch (Exception ex) {
System.out.print(ex.getMessage().toString());
}
Single Data File
Submit Metadata and Single Data File in Python
To submit the JSON-LD object along with data files, create a folder named files and place the file you want to upload inside it.
files_tuples_array = []
upload_file = "path/to/your_file"
files_tuples_array.append(("json-ld", json.dumps(json_ld)))
files_tuples_array.append(("data", open(upload_file, 'rb')))
post_packages_url = "{}{}".format(base, endpoint)
post_package_response = requests.post(post_packages_url,
                                      headers={"Authorization": header_authorization},
                                      files=files_tuples_array)
if post_package_response.status_code == 201:
    # Success
    response = post_package_response.json()
    print(f"View URL:{response['viewUrl']}")
    print(f"Name:{response['dataset']['name']}")
else:
    # There was an error
    print(post_package_response.text)
Remember to change the file directories and file names to your actual names. The directory variable can be left blank if your API script is already located in the same directory as your file.
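A wrong upload_file path is a common slip here and only surfaces as a traceback from open(). A small helper to fail fast; the function name is ours for illustration, not part of the API:

```python
import os

def check_upload_path(path):
    """Return True if path points to a readable regular file."""
    return os.path.isfile(path) and os.access(path, os.R_OK)

# Hypothetical usage before building files_tuples_array:
# if not check_upload_path(upload_file):
#     raise FileNotFoundError(upload_file)
```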
Submit Metadata and Single Data File in R
To submit the JSON-LD object along with data files, create a folder named files and place the file you want to upload inside it.
call_post_package <- paste(base,endpoint, sep="/")
post_package = POST(call_post_package, body=list("json-ld"=json_file,
data=upload_file("your-directory/your-file","text/csv")),
add_headers(Authorization=header_authorization,
"Content-Type"="multipart/form-data"))
Review the results
content(post_package)$viewUrl
[1] "https://data-dev.ess-dive.lbl.gov/view/ess-dive-XXXXXXXXXXXX-20190621T175431086775"
Submit Metadata and Single Data File in Java
To submit the JSON-LD object along with data files, create a folder named files and place the file you want to upload inside it.
try{
String url = base + endpoint;
HttpPost uploadFile = new HttpPost(url);
uploadFile.addHeader("Authorization", header_authorization);
File file_to_upload = new File("/directory-to-your-file/file");
String content = FileUtils.readFileToString(file_to_upload, "UTF-8");
FormBodyPart bodyPart = FormBodyPartBuilder.create()
.setName("files")
.addField("Content-Disposition", "form-data; name=\"data\"; filename=\"<your-file-name>\"")
.setBody(new StringBody(content, ContentType.TEXT_PLAIN))
.build();
MultipartEntityBuilder builder = MultipartEntityBuilder.create()
.setMode(HttpMultipartMode.BROWSER_COMPATIBLE)
.setContentType(ContentType.MULTIPART_FORM_DATA);
builder.addPart("json-ld", new StringBody(JSON_LD.toString(), ContentType.TEXT_PLAIN));
builder.addPart(bodyPart);
uploadFile.setEntity(builder.build());
// Execute the request once and read the response body
CloseableHttpResponse response = httpClient.execute(uploadFile);
HttpEntity responseEntity = response.getEntity();
String responseString = EntityUtils.toString(responseEntity, "UTF-8");
if(response.getStatusLine().getStatusCode() == 201){
System.out.println(response.toString());
System.out.println(responseString);
} else {
System.out.println(response.getStatusLine().getReasonPhrase());
System.out.println(responseString);
}
} catch (Exception ex) {
System.out.print(ex.getMessage().toString());
}
Many Data Files
Submit Metadata and Many Data Files in Python
If you have many files to upload, place them all inside the files directory and use the following code:
files_tuples_array = []
files_upload_directory = "your_upload_directory/"
files = os.listdir(files_upload_directory)
files_tuples_array.append(("json-ld", json.dumps(json_ld)))
for filename in files:
    file_directory = files_upload_directory + filename
    files_tuples_array.append(("data", open(file_directory, 'rb')))
post_packages_url = "{}{}".format(base, endpoint)
post_package_response = requests.post(post_packages_url,
                                      headers={"Authorization": header_authorization},
                                      files=files_tuples_array)
if post_package_response.status_code == 201:
    # Success
    response = post_package_response.json()
    print(f"View URL:{response['viewUrl']}")
    print(f"Name:{response['dataset']['name']}")
else:
    # There was an error
    print(post_package_response.text)
Submit Metadata and Many Data Files in R
See our Dataset API GitHub repository for a complete example in R.
Currently there are no examples of this available in Java.
Edit Dataset
Metadata Only
Edit Metadata in Python
Use the PUT function to update the metadata of a dataset. This example updates the name of a dataset.
dataset_id = "<Enter an ESS-DIVE Identifier here>"
put_package_url = "{}{}/{}".format(base, endpoint, dataset_id)
metadata_update_dict = {"name": "Updated Dataset Name"}
put_package_response = requests.put(put_package_url,
                                    headers={"Authorization": header_authorization},
                                    json=metadata_update_dict)
Check the results for the changed metadata attribute
# Check for errors
if put_package_response.status_code == 200:
    # Success
    response = put_package_response.json()
    print(f"View URL:{response['viewUrl']}")
    print(f"Name:{response['dataset']['name']}")
else:
    # There was an error
    print(put_package_response.text)
Edit Metadata in R
Use the PUT function to update the metadata of a dataset. This example updates the name of a dataset.
call_put_package <- paste(base,endpoint,get_package_json$id, sep="/")
put_package = PUT(call_put_package,
body = "{ \"name\": \"My Tutorial Title\" }",
add_headers(Authorization=header_authorization,
"Content-Type"="application/json"))
Transform the result into a data frame. (Ignore the warning message)
put_package_text <- content(put_package, "text")
put_package_json <- fromJSON(put_package_text)
Check the results for the changed metadata attribute
# Check for errors
if(!http_error(put_package) ){
attributes(put_package_json)
put_package_json$viewUrl
put_package_json$dataset$name
}else {
http_status(put_package)
print(put_package_text)
}
[1] "Data Package updated successfully."
NULL
[1] "https://data-dev.ess-dive.lbl.gov/view/ess-dive-xxx-20190621T185438900719"
[1] "My Tutorial Title"
Edit Metadata in Java
Use the PUT function to update the metadata of a dataset. This example updates the name of a dataset.
String id = "<Enter-your-dataset-id>";
JSONObject JSON_LD_update = new JSONObject();
JSON_LD_update.put("name","Updated dataset title");
try{
String url = base + endpoint + "/" + id;
HttpPut request = new HttpPut(url);
StringEntity params = new StringEntity(JSON_LD_update.toString()); //Setting the JSON-LD Object to the request params
request.addHeader("content-type", "application/json");
request.addHeader("Authorization", header_authorization);
request.setEntity(params);
HttpResponse response;
response = httpClient.execute(request);
HttpEntity entity = response.getEntity();
String responseString = EntityUtils.toString(entity, "UTF-8");
System.out.println(response.getStatusLine().getStatusCode());
System.out.println(response.toString());
System.out.println(response.getStatusLine());
if(response.getStatusLine().getStatusCode() == 200){
System.out.println(response.toString());
System.out.println("Dataset updated");
System.out.println(responseString);
} else {
System.out.println(response.getStatusLine().getReasonPhrase());
System.out.println(responseString);
}
} catch (Exception ex) {
System.out.print(ex.getMessage().toString());
}
Metadata and Data
Edit Metadata and Data in Python
Use the PUT function to update a dataset. This example updates the dataset's publication date to 2019 and adds a new data file.
dataset_id = "<Enter an ESS-DIVE Identifier here>"
files_tuples_array = []
upload_file = "path/to/your_file"
files_tuples_array.append(("json-ld", json.dumps(metadata_update_dict)))
files_tuples_array.append(("data", open(upload_file, 'rb')))
put_package_url = "{}{}/{}".format(base, endpoint, dataset_id)
put_package_response = requests.put(put_package_url,
                                    headers={"Authorization": header_authorization},
                                    files=files_tuples_array)
Check the results for the changed metadata attribute and newly uploaded file
# Check for errors
if put_package_response.status_code == 200:
    # Success
    response = put_package_response.json()
    print(f"View URL:{response['viewUrl']}")
    print(f"Date Published:{response['dataset']['datePublished']}")
    print(f"Files In Dataset:{response['dataset']['distribution']}")
else:
    # There was an error
    print(put_package_response.text)
get_packages_url = "{}{}".format(base, endpoint)
get_packages_response = requests.get(get_packages_url,
                                     headers={"Authorization": header_authorization})
if get_packages_response.status_code == 200:
    # Success
    print(get_packages_response.json())
else:
    # There was an error
    print(get_packages_response.text)
Edit Metadata and Data in R
Use the PUT function to update a dataset. This example updates the dataset's publication date to 2019 and adds a new data file.
call_put_package <- paste(base,endpoint,put_package_json$id, sep="/")
put_package_data = PUT(call_put_package,
body=list("json-ld"="{ \"datePublished\": \"2019\" }",
data=upload_file("your-directory/your-file","text/csv")),
add_headers(Authorization=header_authorization,
"Content-Type"="multipart/form-data"))
Transform the result into a data frame. (Ignore the warning message)
put_package_data_text <- content(put_package_data, "text")
put_package_data_json <- fromJSON(put_package_data_text)
Check the results for the changed metadata attribute and newly uploaded file
# Check for errors
if(!http_error(put_package_data) ){
attributes(put_package_data_json)
print(put_package_data_json$detail)
print(put_package_data_json$errors)
print(put_package_data_json$viewUrl)
print(put_package_data_json$dataset$datePublished)
print(put_package_data_json$dataset$distribution)
}else {
http_status(put_package_data)
print(put_package_data_text)
}
[1] "Data Package updated successfully."
NULL
[1] "https://data-dev.ess-dive.lbl.gov/view/ess-dive-XXXX-20190621T191953176893"
[1] "2019"
name encodingFormat
1 <your first file> text/csv
2 <your second file> text/csv
Check for errors and view the data frame on success
# Check for errors
if(!http_error(put_package_data) ){
  print(put_package_data_json)
}else {
  http_status(put_package_data)
}
Edit Metadata and Data in Java
Use the PUT function to update a dataset. This example updates the dataset's publication date to 2019 and adds a new data file.
String id = "<Enter-your-dataset-id>";
JSONObject JSON_LD_update = new JSONObject();
JSON_LD_update.put("name","Updated dataset");
try{
String url = base + endpoint + "/" + id;
HttpPut uploadFile = new HttpPut(url);
uploadFile.addHeader("Authorization", header_authorization);
File file_to_upload = new File("/directory-to-your-file/file");
String content = FileUtils.readFileToString(file_to_upload, "UTF-8");
FormBodyPart bodyPart = FormBodyPartBuilder.create()
.setName("files")
.addField("Content-Disposition", "form-data; name=\"data\"; filename=\"<your-file-name>\"")
.setBody(new StringBody(content, ContentType.TEXT_PLAIN))
.build();
MultipartEntityBuilder builder = MultipartEntityBuilder.create()
.setMode(HttpMultipartMode.BROWSER_COMPATIBLE)
.setContentType(ContentType.MULTIPART_FORM_DATA);
builder.addPart("json-ld", new StringBody(JSON_LD_update.toString(), ContentType.TEXT_PLAIN));
builder.addPart(bodyPart);
uploadFile.setEntity(builder.build());
CloseableHttpResponse response = httpClient.execute(uploadFile);
HttpEntity responseEntity = response.getEntity();
String responseString = EntityUtils.toString(responseEntity, "UTF-8");
Check the results for the changed metadata attribute and newly uploaded file
// Review the results
if(response.getStatusLine().getStatusCode() == 200){
System.out.println(response.toString());
System.out.println("package updated");
System.out.println(responseString);
} else {
System.out.println(response.getStatusLine().getReasonPhrase());
System.out.println(responseString);
}
} catch (Exception ex) {
System.out.print(ex.getMessage().toString());
}
To compile and run the code, make sure you are in the parent directory containing the essdive.java file, not inside the lib folder.
Assuming Java is already installed on your machine, start by compiling the code with the following terminal command:
javac -cp .:"lib/*" essdive.java
This creates a file containing the compiled code, which you can then run with the following command:
java -cp .:"lib/*" essdive