Difference between revisions of "Develop proposal for special issue"

From Geoscience Paper of the Future
Revision as of 13:30, 26 March 2015


Background: Why a Special Issue on Geoscience Papers of the Future?

Include here our discussion for the vision

Background should be 1-2 pages.

Motivated by the need to fully document research and make it accessible and reproducible.

Motivation: The EarthCube Initiative and the GeoSoft Project

Include here background about GeoSoft from the web site

OSTP memo. EarthCube reports. Other reports that talk about the need for new approaches to editing.

It's possible that small or very large contributions are not well captured in the current publishing paradigms. Nanopublications.

For example, nano-publications are a possible way to share advances in a research process that may not merit a full publication but are nonetheless useful to the community. A challenge here is the stigma against publishing units that are too small.

Alternatively, a very large piece of research or work with many parts may be better suited to a GPF style publication.


Perhaps the concept of a 'paper' can be better reflected in the concept of a 'wrapper': a collection of materials and resources. The purpose is to ensure that publications are representative of the work, effort, and results achieved in the research process.

What is a GPF

Include here our discussion of what is a GPF

The challenges of creating GPFs

The articles in this issue reflect the current best practice for generating a Geoscience Paper of the Future.

Figure discussions: Do we want to generate exactly the same figure automatically? Figures in the paper may be clean versions of an image generated by software. To the extent possible, authors have included clear delineations of provenance. The goal is to ensure that readers can regenerate the figures using documented workflows, data, and codes. An important note (Allen, Sandra) is that frequently figures are generated by code, scripts, etc., yet the actual figure is finalized by hand. Mimi's point: is it really worth belaboring how the prettified version of the figure is made? If it is, both of the visualization packages I've used (Matlab and SigmaPlot) have actual code in the background that specifies the prettification, and this code can be found, copied out, and rerun to generate the exact same figure with all of the prettification in the same place. SigmaPlot uses Visual Basic (I think) in its macros. If explicit code is an important point, this should be doable. But I'm not sure it's strictly necessary to specify exactly where all the prettifications are to get the gist across.
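The point that "prettification" can live entirely in rerunnable code can be illustrated with a short script. The sketch below uses Python and Matplotlib purely as an illustration (the data, filename, and styling choices are hypothetical, not from any paper in this issue): every plotting and styling decision is recorded in code, so rerunning the script regenerates the identical figure with no manual finishing step.

```python
# Minimal sketch: a figure produced entirely by a script, so that rerunning
# the script regenerates the identical figure, styling included.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripted output
import matplotlib.pyplot as plt

np.random.seed(0)  # fix the random seed so the plotted data are reproducible
x = np.linspace(0, 10, 200)
y = np.sin(x) + 0.1 * np.random.randn(x.size)

fig, ax = plt.subplots(figsize=(4, 3))
ax.plot(x, y, color="steelblue", linewidth=1.2, label="observed")
# All "prettification" lives in code, not in manual post-editing:
ax.set_xlabel("time (s)")
ax.set_ylabel("signal")
ax.legend(frameon=False)
fig.tight_layout()
fig.savefig("figure1.png", dpi=300)
```

The same idea applies to Matlab scripts or SigmaPlot macros: as long as every styling step is captured in code that can be rerun, the finished figure is as reproducible as the underlying data.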

How much of your experimental history does one include? (Ibrahim). The experimental process often ends up nowhere. Should we document all the failed experiments? Get one DOI for the results of the successful experiment? Another for failed trials?


Documenting: Timing and Intermediate Processes. When should we document, and what are the bounds on what we document? For example, should we document and include data and workflows for 'failed' experiments? Or should we assign datasets DOIs before we know the results from using them? The group thinks good practice may include documenting and sharing data once you have a clear understanding of the outcomes worth reporting. For example, successful experiments should have clear, clean data documented and shared, whereas one strategy for 'failed' experiments could be bundling the intermediate datasets under one DOI with a more general discussion of the process/methods.

Related work

Include here the related work we have discussed

Papers to be included

Would it be worthwhile to group the papers into broader categories rather than giving specifics about every single paper?

For each submission, we describe:

  • Authors and affiliations
  • Keywords of research area
  • Tentative title
  • Short abstract
  • Challenge
  • Relationship to other publications (is the article based on a previously published article? is it new content?)
  • Pointer to the wiki page that documents the article
  • Expected submission date

[David 2015]

  • Authors and affiliations: Cedric David
  • Keywords of research area:
  • Tentative title: Going beyond triple-checking, allowing for peace of mind in model development.
  • Short abstract:
  • Challenge: Ensure that updates to an existing model are able to reproduce a series of simulations published previously.
  • Relationship to other publications:
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Demir 2015]

  • Authors and affiliations: Ibrahim Demir
  • Keywords of research area: hydrologic network, optimization, network representation, database query
  • Tentative title: Optimization of hydrological network representation for fast access and query in web-based system
  • Short abstract: The article is about benchmarking various network representation techniques for optimization of hydrological network access and query.
  • Challenge:
  • Relationship to other publications: The article is based on a new study
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Fulweiler 2015]

  • Authors and affiliations: Wally Fulweiler
  • Keywords of research area:
  • Tentative title:
  • Short abstract:
  • Challenge:
  • Relationship to other publications: (is the article based on a previously published article? is it new content?)
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Karlstrom and Lay 2015]

  • Authors and affiliations: Leif Karlstrom and Lay Kuan Loh
  • Keywords of research area:
  • Tentative title:
  • Short abstract:
  • Challenge:
  • Relationship to other publications: (is the article based on a previously published article? is it new content?)
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Lee 2015]

  • Authors and affiliations: Kyo Lee
  • Keywords of research area:
  • Tentative title:
  • Short abstract:
  • Challenge:
  • Relationship to other publications: (is the article based on a previously published article? is it new content?)
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Miller 2015]

  • Authors and affiliations: Kim Miller
  • Keywords of research area:
  • Tentative title:
  • Short abstract:
  • Challenge:
  • Relationship to other publications: (is the article based on a previously published article? is it new content?)
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Mills 2015]

  • Authors and affiliations: Heath Mills, University of Houston Clear Lake; Brandi Kiel Reese, Texas A&M Corpus Christi
  • Keywords of research area:
  • Tentative title: Iron and Sulfur Cycling Biogeography Using Advanced Geochemical and Molecular Analyses
  • Short abstract:
  • Challenge: My paper will develop and document a new pipeline to analyze a combined and robust genetic and geochemical data set. New, reproducible methods will be highlighted in this manuscript to help others better analyze similar data sets. There is a general lack of guidance within my field for such challenges. This manuscript will be unique and helpful from an analysis standpoint as well as for the science being presented.
  • Relationship to other publications: Original Manuscript
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Oh 2015]

  • Authors and affiliations: Ji-Hyun Oh, Jet Propulsion Laboratory/University of Southern California
  • Keywords of research area: Tropical Meteorology, Madden-Julian Oscillation, Momentum budget analysis
  • Tentative title: Tools for computing momentum budget for the westerly wind event associated with the Madden-Julian Oscillation
  • Short abstract:
  • Challenge: This paper will cover how to reproduce two key figures from the paper that I recently submitted to Journal of Atmospheric Science. This will include detailed procedures related to generating the figures, such as how/where to download the data, how to transform the format of the data to be used as input for my codes, and so on.
  • Relationship to other publications: This article is related to part of the paper submitted to Journal of Atmospheric Science.
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Pierce 2015]

  • Authors and affiliations: Suzanne Pierce
  • Keywords of research area:
  • Tentative title:
  • Short abstract:
  • Challenge: Fully document a new software application and framework using example case study data and tutorials.
  • Relationship to other publications: (is the article based on a previously published article? is it new content?)
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Pope 2015]

  • Authors and affiliations: Allen Pope, National Snow and Ice Data Center, University of Colorado, Boulder
  • Keywords of research area: Glaciology, Remote Sensing, Landsat 8, Polar Science
  • Tentative title: Data and Code for Estimating and Evaluating Supraglacial Lake Depth With Landsat 8 and other Multispectral Sensors
  • Short abstract: Supraglacial lakes play a significant role in glacial hydrological systems – for example, transporting water to the glacier bed in Greenland or leading to ice shelf fracture and disintegration in Antarctica. To investigate these important processes, multispectral remote sensing provides multiple methods for estimating supraglacial lake depth – either through single-band or band-ratio methods, both empirical and physically-based. Landsat 8 is the newest satellite in the Landsat series. With new bands, higher dynamic range, and higher radiometric resolution, the Operational Land Imager (OLI) aboard Landsat 8 has a lot of potential.

This paper will document the data and code used in processing in situ reflectance spectra and depth measurements to investigate the ability of Landsat 8 to estimate lake depths using multiple methods, as well as quantify improvements over Landsat 7’s ETM+. A workflow, data, and code are provided to detail promising methods as applied to Landsat 8 OLI imagery of case study areas in Greenland, allowing calculation of regional volume estimates using 2013 and 2014 summer-season imagery. Altimetry from WorldView DEMs is used to validate lake depth estimates. The optimal method for supraglacial lake depth estimation with Landsat 8 is shown to be an average of the single-band depths from the red and panchromatic bands. With this best method, preliminary investigation of seasonal behavior and elevation distribution of lakes is also discussed and documented.

  • Challenge: Reproducibility, Dark Code
  • Relationship to other publications: Documenting and explaining the data and code behind the analysis and results presented in another paper.
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date: Late June 2015

[Read and Winslow 2015]

  • Authors and affiliations: Jordan Read and Luke Winslow
  • Keywords of research area:
  • Tentative title:
  • Short abstract:
  • Challenge:
  • Relationship to other publications: (is the article based on a previously published article? is it new content?)
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Tzeng 2015]

  • Authors and affiliations: Mimi Tzeng, Brian Dzwonkowski (DISL); Kyeong Park (TAMU Galveston)
  • Keywords of research area: physical oceanography, remote sensing
  • Tentative title: Fisheries Oceanography of Coastal Alabama (FOCAL): A Subset of a Time-Series of Hydrographic and Current Data from a Permanent Moored Station Outside Mobile Bay (27 Jan to 18 May 2011)
  • Short abstract: The Fisheries Oceanography in Coastal Alabama (FOCAL) program began in 2006 as a way for scientists at Dauphin Island Sea Lab (DISL) to study the natural variability of Alabama's nearshore environment as it relates to fisheries production. FOCAL provided a long-term baseline data set that included time-series hydrographic data from a permanent offshore mooring (ADCP, vertical thermistor array, and CTDs at surface and bottom) and shipboard surveys (vertical CTD profiles and water sampling), as well as monthly ichthyoplankton and zooplankton (depth-discrete) sample collections at FOCAL sites. The subset of data presented here is from the mooring and includes a vertical array of thermistors, CTDs at surface and bottom, an ADCP at the bottom, and vertical CTD profiles collected at the mooring during maintenance surveys. The mooring is located at 30 05.410'N 88 12.694'W, 25 km southwest of the entrance to Mobile Bay. Temperature, salinity, density, depth, and current velocity data were collected at 20-minute intervals from 2006 to 2012. Other parameters, such as dissolved oxygen, are available for portions of the time series depending on which instruments were deployed at the time.
  • Challenge: My paper will be about the processing of data in a larger dataset from which peer-reviewed papers have been written. The processing I did was not specific to any particular paper. I can point to an example paper that used some of the data I processed from this dataset; however, all of the figures in that paper are composites that also include other data from elsewhere that I had nothing to do with (and it wouldn't be feasible to try to get hold of the other data within our timeframe).
  • Relationship to other publications: A recent paper that used the part of the FOCAL data I'm documenting as the sample from the larger dataset: Dzwonkowski, Brian, Kyeong Park, Jungwoo Lee, Bret M. Webb, and Arnoldo Valle-Levinson. 2014. "Spatial variability of flow over a river-influenced inner shelf in coastal Alabama during spring." Continental Shelf Research 74:25-34.
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date:

[Villamizar 2015]

  • Authors and affiliations: Sandra Villamizar, University of California, Merced
  • Keywords of research area: river ecohydrology
  • Tentative title: Producing long-term series of whole-stream metabolism using readily available data.
  • Short abstract: Continuous water quality and discharge data that are readily available through government websites may be used to produce useful information about the processes within a river ecosystem. This paper will provide a detailed description on how to produce a long-term series of whole stream metabolism for the case of the restoration reach of the San Joaquin River in California.
  • Challenge: Document new software/applications
  • Relationship to other publications: This will be a new publication
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date: To be defined

[Yu and Bhatt 2015]

  • Authors and affiliations: Xuan Yu, Department of Geological Sciences, University of Delaware. Gopal Bhatt, Department of Civil & Environmental Engineering, Pennsylvania State University.
  • Keywords of research area: coupled surface and subsurface flow, integrated hydrologic modeling, PIHM
  • Tentative title: Learning integrated modeling of surface and subsurface flow from scratch
  • Short abstract: Integrated modeling of surface and subsurface flow has been of great interest for understanding not only the intimate interconnectedness of hydrological processes, but also land-surface energy balance, biogeochemical and ecological processes, and landscape evolution. Although a growing number of complex hydrologic models have been used for resolving environmental processes, testing hypotheses, and making hydrologic predictions for effective watershed management, very limited resources on model implementation have been made accessible to the larger group of model users. Users have to invest a significant amount of time and effort to reproduce, and to understand, the workflow of a hydrologic simulation in a modeling paper. To provide a challenging and stimulating introduction to integrated modeling of surface and subsurface flow, in this paper we revisit the development of the Penn State Integrated Hydrologic Model (PIHM) by reproducing a numerical benchmarking example and a real-world catchment-scale application. Specifically, we document PIHM and its modeling workflow to enable a basic understanding of simulating coupled surface and subsurface flow processes. We provide the model and data to highlight the reciprocal roles between the two. In addition, we incorporate user experience as a third dimension of the modeling workflow to enable deeper communication between model developers and users. The workflow has important implications for smoothing and accelerating open scientific collaboration in geosciences research.
  • Challenge: Reproduce published simulations with the latest version of an existing model. Benchmark the modeling application against a numerical experiment and field data.
  • Relationship to other publications: The article is based on a previously published article.
  • Pointer to the wiki page that documents the article: Page
  • Expected submission date: End of June 2015

Special Issue Editors

  • Co-editor: Chris Duffy and/or Scott Peckham
  • Co-editor: Cedric David
  • Co-editor: possibly Karan Venayagamoorthy

The editors will only accept submissions that follow the special issue review criteria.

The editors will select a set of reviewers to handle the submissions. Reviewers will include computer scientists, library scientists, and geoscientists.

Special Issue Review Criteria

The reviewers will be asked to provide feedback on the papers according to the following criteria. Note that some papers will have good reasons for limiting the information (e.g., the data are from third parties and not openly available), in which case they should document those reasons.

  • Documentation of the datasets: descriptions of datasets, unique identifiers, repositories.
  • Documentation of software: description of all software used (including pre-processing of data, visualization steps, etc), unique identifiers, repositories.
  • Documentation of the provenance of results: provenance for each figure or result, such as the workflow or the provenance record.

Tentative Timeline

  • Journal committed to special issue: April 15, 2015
  • Submissions due to editors: June 30, 2015
  • Reviews due: Sept 15, 2015
  • Decisions out to authors: Sept 30, 2015
  • Revisions due: October 31, 2015
  • Final versions due: November 15, 2015
  • Issue published: December 31, 2015