
Conversation


@xylar xylar commented Apr 17, 2018

This merge updates an XML database describing the observations used in MPAS-Analysis and makes some changes to how this database is displayed as tables in the documentation.


xylar commented Apr 17, 2018

@milenaveneziani, this is a PR off of #333 for filling in the XML files based on the tables on confluence.

What we want here is some basic information about which observations we use, how a user can get them, and how they can be post-processed to be usable in MPAS-Analysis. Essentially, a user should be able to start from nothing, download MPAS-Analysis (or install it from the e3sm anaconda channel), follow these instructions to set up the observations, and run MPAS-Analysis.


xylar commented Apr 17, 2018

Just to note: Another issue is the MOC and ice-shelf region files


xylar commented Apr 17, 2018

@milenaveneziani, to the degree that you have time, I would appreciate your help filling in these XML files. In particular, when I get to the point where there are instructions on how to download and install the observations, I will need your help on any data sets you know how to re-create.

@xylar xylar force-pushed the fill_in_obs_xml branch from 1d54517 to 40429ce on April 20, 2018 17:27
xylar added 2 commits April 20, 2018 19:35
There are no tasks that relate to it yet.
@xylar xylar force-pushed the fill_in_obs_xml branch from 40429ce to a476649 on April 20, 2018 17:36
Also, rename the file (it is not a table) and modify the requested fields for each entry.
@xylar xylar force-pushed the fill_in_obs_xml branch from a476649 to 0212f86 on April 24, 2018 14:36
<bibtex>
</bibtex>
<dataUrls>
(missing)
Collaborator Author

@milenaveneziani, I need help finding this URL. I've wandered around the Aquarius FTP server with no luck. We need to find Aquarius_V3_SSS_Monthly.nc (or its V4 or V5 equivalent).

@milenaveneziani milenaveneziani Apr 24, 2018

Apparently this file/data is no longer available; I cannot find it anywhere either. But I looked a bit at the SSS data that is available now, and this seems interesting:
https://podaac.jpl.nasa.gov/dataset/AQUARIUS_L4_OISSS_IPRC_7DAY_V4?ids=ProcessingLevel&values=*4*&search=Aquarius
It's a blended product where the OI takes into account ARGO data as well, to correct/amend where the satellite data has a high uncertainty/error. It's also higher resolution, 0.5degx0.5deg.
It's only available as a 7-daily product (many 7-day files), so we would have to compute the monthly data ourselves as preprocessing.
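The monthly averaging itself would be straightforward; a minimal sketch of that step is below (the function name and the list-of-arrays input are hypothetical, and a real script would first read the fields out of the 7-day files, e.g. with xarray):

```python
import numpy as np


def monthly_means(times, fields):
    """Average 7-day fields into calendar-month means.

    times  -- a datetime for each 7-day window (e.g. the window center)
    fields -- a 2-D numpy array (lat x lon) for each window
    """
    buckets = {}
    for time, field in zip(times, fields):
        # group every window by the (year, month) it falls in
        buckets.setdefault((time.year, time.month), []).append(field)
    months = sorted(buckets)
    means = np.stack([np.mean(buckets[month], axis=0) for month in months])
    return months, means
```

Note that a window straddling a month boundary gets assigned entirely to one month here; weighting by days-in-month would be a later refinement.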

Collaborator Author

Okay, so it sounds like we just leave this as "missing" for the time being and we'll do a whole process down the road where we make a preprocessing script that hopefully can download the data (if needed) and process it into a monthly climatology. Definitely a longer-term project...

Collaborator

the older one was all monthly data between 2011 and 2015 in one single file, and MPAS-Analysis would do the climatology, right?

Collaborator Author

> MPAS-Analysis would do the climatology, right?

Yes, that's right.

(missing)
</dataUrls>
<preprocessing>
(missing)
Collaborator Author

@milenaveneziani, I believe there is no preprocessing, correct?

Collaborator

left preprocessing as 'missing', but I added the source link that points to the blended/OI product.

(http://www.aviso.altimetry.fr/duacs/)"
</releasePolicy>
<references>
(missing)
Collaborator Author

@milenaveneziani or @maltrud, are either of you aware of an AVISO reference we should be citing here?

Collaborator

The Journal reference is missing and the doi page brings you to:
https://podaac.jpl.nasa.gov/dataset/AVISO_L4_DYN_TOPO_1DEG_1MO

Collaborator Author

Maybe just citing this pdf as a @misc?
ftp://podaac.jpl.nasa.gov/allData/aviso/L4/dynamic_topo_1deg_1mo/docs/zosTechNote_AVISO_L4_199210-201012.pdf
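Something along these lines, maybe (every field below is a placeholder I have not verified; the key and title especially would need to be checked against the document itself):

```bibtex
@misc{aviso_l4_technote,
  author       = {{AVISO}},
  title        = {Technical note for the {AVISO} {L4} dynamic topography product},
  howpublished = {\url{ftp://podaac.jpl.nasa.gov/allData/aviso/L4/dynamic_topo_1deg_1mo/docs/zosTechNote_AVISO_L4_199210-201012.pdf}},
  note         = {Accessed April 2018}
}
```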

ocean
</component>
<description>
(missing)
Collaborator Author

@maltrud, can you help fill in all the missing info about the Trenberth and Caron MHT data set?

</component>
<description>
(missing)
</description>
Collaborator Author

@vanroekel, can you help fill in the missing info about the 2 Nino 3.4 data sets?

}
</bibtex>
<dataUrls>
(missing, requires registration with NSIDC)
@xylar xylar Apr 24, 2018

@milenaveneziani, can you help me figure out what the URL would be here? Same for Bootstrap.

Collaborator Author

Yes, I'm registered but then I just see a huge number of .bin files, one for each month since 1979. There obviously was also a pre-processing script if this is what the input data was...

Collaborator

Yes, the sea-ice data is the one we (myself and @maltrud) pre-processed the most. We went from binary to netcdf files, and from the polar stereographic projection to the regular lon/lat grid. I did the first step with a matlab script, and Mat did the second step, either with Ferret or ncremap; I can't recall which.
But I didn't know a user registration was now necessary...
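For when this gets redone in python: the binary-decoding step is small. A sketch is below; the 300-byte header, the one byte per grid cell, and the divide-by-250 scaling follow my understanding of the NSIDC NASA Team binary format, and the grid shapes are the standard polar stereographic grids, but all of that should be checked against the NSIDC product documentation rather than taken from here:

```python
import numpy as np

# assumed (rows, cols) of the NSIDC polar stereographic grids
NSIDC_SHAPES = {'north': (448, 304), 'south': (332, 316)}


def read_nsidc_conc(raw, hemisphere='north'):
    """Decode one NSIDC NASA Team binary file (as bytes) into a
    fractional sea-ice concentration array.

    The file is assumed to carry a 300-byte header followed by one
    byte per grid cell; values <= 250 are concentration * 250, and
    larger values are flags (land, pole hole, ...), masked as NaN.
    """
    shape = NSIDC_SHAPES[hemisphere]
    data = np.frombuffer(raw[300:], dtype=np.uint8)
    data = data.reshape(shape).astype(float)
    return np.where(data <= 250, data / 250.0, np.nan)
```

Writing the result to NetCDF and remapping to lon/lat would then replace the matlab + Ferret/ncremap steps.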

Collaborator

same for ICESat.

Collaborator Author

Okay, well, we're going to have to redo that stuff in a python script. When we do that, we might want to consider staying with the polar stereographic grid, as I already do for the Antarctic melt and SOSE plots (i.e. leaving the obs on that grid and just remapping MPAS data to the same grid). But clearly that's way beyond the scope of the current database.

Collaborator Author

Feel free to add any script(s) you have lying around to the preprocess_observations directory. But if your scripts only document part of the preprocessing that was performed, it would be good to at least indicate here that there are missing steps.

Collaborator

Added all the scripts I have to the preprocessing_obs folder. I also added a txt file with notes about how we preprocessed the individual sea-ice data sets.

(missing, requires registration with NSIDC)
</dataUrls>
<preprocessing>
(missing)
Collaborator Author

@milenaveneziani, did you have to preprocess these?

- https://neptune.gsfc.nasa.gov/uploads/files/SH_IceExt_Monthly_1978-2012.txt
</dataUrls>
<preprocessing>
(missing)
Collaborator Author

@milenaveneziani, do you have a script you used to convert these text files to NetCDF?

Collaborator

yes, it's a simple matlab script.

Collaborator Author

Okay, might be worth adding for now (under preprocess_observations). We'll convert it to a python script when we want to use it in an automated way down the road.
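For reference, the parsing half of a python replacement could look like this. I am assuming each data row is whitespace-separated with year and month in the first two columns and the hemispheric total in the last one (a guess from this discussion, not checked against the actual files):

```python
import numpy as np


def parse_monthly_series(text):
    """Parse a whitespace-delimited monthly time series where each
    data row is 'year month ... total' (assumed layout).

    Header or malformed rows are skipped.
    """
    years, months, totals = [], [], []
    for line in text.splitlines():
        parts = line.split()
        if len(parts) < 3 or not parts[0].isdigit():
            continue  # not a data row
        years.append(int(parts[0]))
        months.append(int(parts[1]))
        totals.append(float(parts[-1]))
    return np.array(years), np.array(months), np.array(totals)
```

Writing the three arrays out to a NetCDF file would then be a few more lines with netCDF4 or xarray.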

}
</bibtex>
<dataUrls>
(missing, requires registration with NSIDC)
Collaborator Author

@milenaveneziani, can you figure out the URL?

Collaborator

hmm, I realize tracking future changes in the data sets will be a task in itself.. I now see different text files, which also contain ice area for specific subregions of the Arctic/SO. The last column is the total hemispheric average (as before).
Here are the URLs:
https://neptune.gsfc.nasa.gov/uploads/files/NH_IceArea_Monthly_1978-2012.txt
https://neptune.gsfc.nasa.gov/uploads/files/SH_IceArea_Monthly_1978-2012.txt

- http://psc.apl.uw.edu/wordpress/wp-content/uploads/schweiger/ice_volume/PIOMAS.2sst.monthly.Current.v2.1.txt
</dataUrls>
<preprocessing>
(missing)
Collaborator Author

@milenaveneziani, did you have a script for processing the text file into a NetCDF file?

Collaborator

yup, matlab again. I'll add it to the preprocessing folder.

co-located on the same 1 degree grids.
</description>
<source>
- [AVISO+ website](https://www.aviso.altimetry.fr/en/data/products/sea-surface-height-products/global/madt-h-uv.html)
Collaborator

I would remove this link, since we got the data from the NASA JPL site. Seems confusing to me.

Collaborator Author

Okay, fine with me. If you make any other changes to this branch, feel free to remove this link at the same time.

<releasePolicy>
[Acknowledgment:] Hurrell, J. W., J. J. Hack, D. Shea, J. M. Caron, and J. Rosinski,
2008: A New Sea Surface Temperature and Sea Ice Boundary Dataset for the Community
Atmosphere Model. Journal of Climate, 21, 5145-5153.
Collaborator

added this here, from email I got from UCAR.

The scripts were added to the repo but not listed in the database.
</bibtex>
<dataUrls>
(missing)
(ftp://podaac-ftp.jpl.nasa.gov/allData/aquarius/L4/IPRC/v4/7day/)
Collaborator Author

@milenaveneziani, I was thinking that these URLs would be for the individual data files, not just a directory. The idea is that the script will eventually use them to download the data. But maybe the better plan is just to hard-code the URL into the scripts themselves. Especially for cases like this where hundreds of files are needed, a list of files is impractical.
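To illustrate the hard-coded-URL idea (everything below is hypothetical: the base URL is a placeholder, and the filename list would come from the server listing or a known naming pattern):

```python
import os
import urllib.request


def build_urls(base_url, filenames):
    """Join a hard-coded base URL with each filename."""
    base = base_url.rstrip('/')
    return ['{}/{}'.format(base, name) for name in filenames]


def download_all(base_url, filenames, out_dir):
    """Fetch each file into out_dir, skipping ones already present."""
    os.makedirs(out_dir, exist_ok=True)
    for name, url in zip(filenames, build_urls(base_url, filenames)):
        dest = os.path.join(out_dir, name)
        if not os.path.exists(dest):
            urllib.request.urlretrieve(url, dest)
```

Only the URL construction is worth pinning down here; `download_all` would actually hit the server.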

@milenaveneziani milenaveneziani Apr 25, 2018

oh, there are many 7-daily files in this case..
Also, we need to think about the best way to maintain these scripts in the future, since changes to the data sets are likely to happen more often than we would like...

Collaborator Author

Yes, my thinking is that we would do a completely "fresh" checkout of the analysis from time to time where we would test that all the downloading and preprocessing works as expected. Presumably this would also show us where datasets have disappeared and scripts need to be updated accordingly.

Collaborator Author

Obviously, this is well beyond the time frame of this PR ;-)

@xylar xylar changed the title Work in Progress: Fill in xml for observations Fill in xml for observations Apr 25, 2018

xylar commented Apr 25, 2018

@milenaveneziani, I think in the interest of getting a 0.7.5 tag, we should call this good enough and get input from @vanroekel and @maltrud as part of a separate PR. Updating this database will be an ongoing effort in any case. What do you think?

@milenaveneziani milenaveneziani left a comment

I agree. This is good enough for now.


xylar commented Apr 25, 2018

Thanks, @milenaveneziani!

@xylar xylar merged commit f2b8487 into MPAS-Dev:develop Apr 25, 2018
@xylar xylar deleted the fill_in_obs_xml branch April 25, 2018 15:56