Filters: Tags: Guidance Document
20 results
This project aimed to address the long-standing need for a more formalized approach to data management planning at the science center (program) level in USGS. The study used two different science centers as test cases. Improved planning for data management and data integration is identified in the Bureau science strategy goals (U.S. Geological Survey, 2007; Burkett and others, 2011), which call for consistent, unified data management so that the USGS science community can provide accessible, high-confidence data and information.
Benefits
- Two data management models for other science centers to use
- A data management framework tested by use-case scenario
Deliverables
- A Science Center review on data management...
Over the last few years, the ISO 19115 family of metadata standards has become the predominant worldwide standard for sharing information about the availability and usability of scientific datasets among researchers. U.S. interest in the ISO standard has also been growing as global-scale science demands participation with the broader international community; however, adoption has been slow because of the complexity and rigor of the ISO metadata standards. In addition, support for the standard in current implementations has been minimal. In 2009, members of the Alaska Data Integration Working Group (ADIwg) mobilized to jointly address common data integration efforts. Beginning in 2012, ADIwg started...
FAIR is an international set of principles for improving the findability, accessibility, interoperability, and reusability of research data and other digital products. The PIs for this CDI project planned and hosted a workshop of USGS data stakeholders, data professionals, and managers of USGS data systems from across the Bureau’s Mission Areas. Workshop participants shared case studies that fostered collaborative discussions, resulting in recommended actions and goals to make USGS research data more FAIR. Project PIs are using the workshop results to produce a roadmap for adopting FAIR principles in USGS. The FAIR Roadmap will be foundational to FY2021 CDI activities to ensure the persistence and usability of...
A number of monitoring method and protocol libraries already exist. Although these systems have been tailored to particular disciplines or research foci, the underlying principles, mechanisms, and processes share commonalities that could facilitate synthesizing their content and information. The Protocol Library project consists of modifying, and thus extending, the capabilities of the existing NEMI methods compendium. To incorporate a broad array of protocols, NEMI developers have surveyed protocol owners about their requirements and asked how they would prefer to access the data. Input forms have been created to accommodate the desires of protocol...
A unique opportunity has arisen for USGS to collaborate with IRIS-PASSCAL (the national seismic instrument facility) to develop a geophysical data archive format that follows FAIR principles. IRIS-PASSCAL is extending its facility to include magnetotelluric (MT) instruments, which requires it to archive the collected MT data by extending its existing protocol. Concurrently, Congress has mandated that the USGS collect nationwide MT data (5,000 stations), all of which will need to be archived under FAIR principles. In collaboration with IRIS-PASSCAL, we propose to develop a generalized HDF5 format for archiving MT data that can easily be extended to other geophysical data in the future. This project will...
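The proposed HDF5 format itself is not shown in this excerpt. As a rough illustration of the idea, the sketch below models a FAIR-oriented group/dataset/attribute hierarchy for one MT station as a plain Python dict, so it runs without the h5py library; all group names, attribute names, and values are hypothetical, not the project's actual schema.

```python
# Hypothetical sketch of a self-describing, FAIR-oriented HDF5 layout for
# magnetotelluric (MT) data. A nested dict stands in for the HDF5 file;
# "attrs" entries mirror HDF5 attributes that carry reusable metadata.

def build_mt_archive(station_id, latitude, longitude, ex_series, sample_rate_hz):
    """Return a nested dict mirroring an HDF5 group/dataset hierarchy."""
    return {
        "attrs": {  # file-level metadata supports findability and reuse
            "standard": "illustrative MT-HDF5 draft",
            "license": "CC0",
        },
        f"stations/{station_id}": {
            "attrs": {"latitude": latitude, "longitude": longitude},
            "channels/Ex": {
                "attrs": {"units": "mV/km", "sample_rate_hz": sample_rate_hz},
                "data": list(ex_series),
            },
        },
    }

archive = build_mt_archive("MT001", 64.8, -147.7, [0.12, 0.15, 0.11], 1.0)
```

Because every dataset travels with its units, sample rate, and location, a generic reader could interpret any station without side-channel documentation, which is what makes such a layout extensible to other geophysical data.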
The purpose of this project was to support enhanced search, access, and visualization capability for disaster maps and other contributed products on the public USGS Hazards Data Distribution System (HDDS) (U.S. Geological Survey, 2015). These products are often provided to USGS by collaborators for sharing across the response community during an emergency response; however, in the past, they were not easy for users to discover or access. This project involved the design, testing, and delivery of a new capability for HDDS to ingest, catalog, and display informational or value-added products provided in a variety of formats. As a result of this work, the user community will be able to...
The Fort Collins Science Center Web Applications Team, the Core Science Analytics and Synthesis unit of the Core Science Systems Mission Area, and a North Central Climate Science Center/NCAR/NOAA partnership group collaborated on a set of automated tools for remapping metadata records from the Federal Geographic Data Committee (FGDC) standard to the International Organization for Standardization (ISO) standards. This project addressed the challenge of expediting the conversion of millions of metadata records in multiple USGS catalogs that run the risk of being left in a deprecated transfer format. The project set the stage for metadata conversions by: 1. Providing roadmaps and automated processes to remap FGDC...
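To illustrate the kind of automated remapping involved, the sketch below crosswalks two FGDC elements into simplified ISO-style elements using only Python's standard library. The element paths and output tags are simplified stand-ins, not the actual ISO 19115 schema (which uses full namespaces) or the project's tooling.

```python
import xml.etree.ElementTree as ET

# Hypothetical crosswalk for two fields; a real FGDC-to-ISO conversion
# covers hundreds of elements and the full ISO 19115 namespace set.
FGDC_TO_ISO = {
    "idinfo/citation/citeinfo/title": "identificationInfo/citation/title",
    "idinfo/descript/abstract": "identificationInfo/abstract",
}

def remap_fgdc(fgdc_xml):
    """Copy mapped FGDC element text into a simplified ISO-style tree."""
    fgdc = ET.fromstring(fgdc_xml)
    iso = ET.Element("MD_Metadata")
    for src, dst in FGDC_TO_ISO.items():
        node = fgdc.find(src)
        if node is None or node.text is None:
            continue  # skip fields absent from the source record
        parent = iso
        for part in dst.split("/"):  # create the destination path as needed
            child = parent.find(part)
            if child is None:
                child = ET.SubElement(parent, part)
            parent = child
        parent.text = node.text
    return iso

record = """<metadata><idinfo>
  <citation><citeinfo><title>Streamflow Data</title></citeinfo></citation>
  <descript><abstract>Daily streamflow records.</abstract></descript>
</idinfo></metadata>"""
iso = remap_fgdc(record)
```

A table-driven crosswalk like this is what makes bulk conversion of millions of records tractable: the mapping is data, so extending coverage means adding rows, not code.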
2012 Updates - Phase 2 (information from the FY12 CDI Annual Report) This project solicited input from USGS Mission Areas, Geographic Areas, CDI, and others on the Phase 1 FY11 Data Management Education Products. The proposal called for an interface with the Data Management Website Working Group to make materials available. The work also included developing content for a USGS data management training program based on existing materials and data management training. Finally, a format/structure for a data management training workshop was completed.
Benefits
- Inform and encourage the broadest possible application of data management best practices
- Knowledge of data management practices in the Survey will increase...
The purpose of the Data Management Training (DMT) Clearinghouse project was twofold. First, the project aimed to increase the discoverability and accessibility of the wealth of learning resources that have been developed to inform and train scientists about data management in the Earth sciences. Second, the project team wanted to facilitate the use of these learning resources by providing descriptive information (metadata) that can help research scientists, students, or teachers assess whether a resource would be appropriate and useful for their needs. The project team established the following objectives: Create an online, searchable, and browsable clearinghouse of learning resources on data...
The purpose of this project was to document processes for USGS scientists to organize and share data using ScienceBase, and to provide an example interactive mapping application to display those data. Data and maps from Chase and others (2016a, b) were used for the example interactive maps.
Accomplishments
The project team developed an interactive mapping application in R that connects to data on ScienceBase, using Shiny, Leaflet (Cheng and Xie, 2016), and sbtools (Winslow and others, 2016) (fig. 10). USGS scientists can refer to the R code in the mapping application to build their own interactive maps. Code is available at the USGS Bitbucket Repository...
Large online data catalogs use controlled vocabularies to categorize datasets in ways that allow end users to sort and select data matching their needs. The eventual goal of this project is to build functional services so that the USGS Thesaurus and other USGS-controlled vocabularies will be available to the English-speaking scientific community, especially within the USGS, where they can be used to improve metadata quality and data discovery. The project team used the Tetherless World Constellation (TWC) Semantic Web Methodology, which is designed to examine use cases and determine both functional and nonfunctional system requirements without prejudicial commitments to meeting those requirements by utilizing particular...
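To show how a controlled vocabulary supports data discovery, the sketch below models SKOS-style broader/narrower term links in memory. The terms and relations are invented examples, not the actual USGS Thesaurus, and a production service would expose such lookups over the web rather than from a hard-coded table.

```python
# Toy in-memory vocabulary with SKOS-style broader-term links.
# Terms and relations are illustrative, not the USGS Thesaurus.
BROADER = {
    "streamflow": "surface water",
    "surface water": "water resources",
    "groundwater": "water resources",
}

def ancestors(term):
    """Walk broader-term links from a term up to the top concept."""
    chain = []
    while term in BROADER:
        term = BROADER[term]
        chain.append(term)
    return chain

def narrower(term):
    """Invert the broader map to find a term's direct narrower terms."""
    return sorted(t for t, b in BROADER.items() if b == term)

print(ancestors("streamflow"))
print(narrower("water resources"))
```

A catalog can use `ancestors` to index a dataset under every broader concept (so a search for "water resources" also finds streamflow data) and `narrower` to offer drill-down facets.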
The cloud offers new and exciting opportunities for USGS employees to leverage computing resources and services that can quickly improve their workflows and reduce the expenditures typically associated with establishing a comparable environment on physical infrastructure. However, because access to and use of the cloud environment are still new, there is limited documentation and there are few shared examples detailing how those resources and services are being used across the USGS. Developing a platform that allows cloud users to contribute to the available documentation, and that provides a location to consolidate information relevant to operating in the USGS cloud, will help reduce duplication of effort across projects that share...
USGS will soon transition to the international metadata standards known collectively as ISO 19115. The open-ended nature of ISO offers much greater flexibility and a richer vocabulary for describing research products. However, that flexibility means there are few constraints to guide authors and ensure standardized, robust documentation across the Bureau. This project proposed that the USGS data community develop content specifications to define standard USGS ISO metadata content requirements. These specifications would be modular in order to meet the documentation needs of the diverse range of research data that USGS produces. Using the specifications, metadata authors will be guided to include appropriate metadata fields for...
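To make the idea of modular content specifications concrete, the sketch below validates a metadata record against a base specification plus optional modules. The field names and module definitions are hypothetical, not the project's actual specifications.

```python
# Hypothetical modular content specifications: each module lists the
# metadata fields it requires, and a record is checked against the base
# specification plus whichever modules apply to that data type.
BASE_SPEC = {"title", "abstract", "pointOfContact"}
MODULES = {
    "geospatial": {"boundingBox", "spatialReference"},
    "timeseries": {"temporalExtent"},
}

def missing_fields(record, modules):
    """Return the sorted required fields absent from the record."""
    required = set(BASE_SPEC)
    for m in modules:
        required |= MODULES[m]
    return sorted(required - record.keys())

record = {
    "title": "MT survey",
    "abstract": "Magnetotelluric station data.",
    "boundingBox": "-150,-140,60,65",
}
missing = missing_fields(record, ["geospatial"])
```

Because requirements compose from modules, the same validator covers a map, a time series, or both, which is how a modular specification can span the diverse data USGS produces.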
The Center for Biological Informatics (CBI, now reorganized into the Core Science Analytics and Synthesis unit of the Core Science Systems Mission Area) and the Fort Collins Science Center Web Applications Team joined resources to integrate their respective Metadata Tool and ScienceBase systems:
- Incorporated the CBI-developed Metadata Tool into the ScienceBase metadata workflow. This work involved diagramming how several independently developed systems can work together to manage data in the USGS and make it available to partners and the public. See the illustration of the USGS data management workflow from the 2011 CDI Annual Report (link in Resource section below).
- Designed and built a rich user interface client that...
2012 Updates (from the 2012 CDI Annual Review): This proposal would continue building website content, including best practices, tools, recommended reading, and a data management planning tool. It also includes usability testing of the website and ongoing maintenance of the website and its content.
Benefits
- USGS researchers will have easy access to standards, tools, and best practices
- Centralized, CDI-vetted reference for scientists
Deliverables
- A Data Management Web site: www.usgs.gov/datamanagement/
- Poster presented and demonstration given at CDI DataBlast (July 2012)
- Presentation given at CDI-hosted Webinar (September 2012)
- Internal CDI Project Progress Report
2011 tasks (from...
Increasingly, USGS scientists seek to share and collaborate while working on data and code, and they often require advanced computing resources. Jupyter Notebooks are one tool for creating such workflows: interactive “notebooks” that combine code and text in one document, enabling scientists to share the stories held within their data. Recently, USGS launched an instance of Pangeo, a community platform for Big Data geoscience, as a tool for internally hosting and executing these notebooks. Few examples exist of how to use Pangeo, and no formal documentation exists for USGS scientists. We will create and curate examples of using Jupyter Notebooks...
Prior to this project, data acquired from USGS Unmanned Aircraft Systems (UAS) had been provided to requesting scientists but had not been made available to the broader USGS community, the U.S. Department of the Interior (DOI) bureaus, or the public at large. This project performed a pilot study and developed a scalable strategy that can evolve into a permanent UAS data management capability. The goal is to make UAS datasets available over the Internet to the USGS, DOI, and public science communities by establishing robust data management strategies and integrating these data with other geospatial datasets in the existing infrastructure at the USGS EROS Data Center. Accomplishments The accomplishments...
The scientific legacy of the USGS is the data gathered over 130 years of research and the scientific knowledge derived from it. However, it is widely assumed, and in some cases known, that high-quality data, particularly legacy data critical for large time-scale analyses such as climate change and habitat change, are hidden away in case files, file cabinets, and hard drives housed in USGS science centers and field stations (both hereafter “science centers”). Many USGS science centers, such as the Fort Collins Science Center, have long, established research histories, are known repositories of data sets, and conduct periodic “file room cleanout” days that establish and enforce some minimal data lifecycle management...
As research and management of natural resources shift from local to regional and national scales, the need to summarize information about aquatic systems at multiple scales is becoming more apparent. Recently, four federally funded national stream assessment efforts (USGS Aquatic GAP, the USGS National Water-Quality Assessment Program, the U.S. Environmental Protection Agency [EPA] StreamCat, and the National Fish Habitat Partnership) identified and summarized landscape information at two hydrologically and ecologically significant scales, local and network catchments, for the National Hydrography Dataset Plus (NHDPlus). These efforts have revealed that a significant percentage of assessment funds is being directed to the...
Drought is a major problem in the American Southwest that is expected to worsen under the effects of climate change. Currently, the Southwest Biological Science Center is monitoring the effects of drought with soil moisture probes in a range of ecosystems across an elevational gradient on the Colorado Plateau. These data are used in multiple studies to analyze the effects of drought on vegetation composition and demography. Accessing and analyzing the data still relies on traditional site visits, which can result in delayed recognition of erroneous data, a common pitfall in many field-science operations. We propose to improve upon this traditional data workflow by leveraging the use of Internet of Things (IoT)...
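To illustrate the kind of automated screening that could replace delayed, manual recognition of erroneous data, the sketch below flags out-of-range and "stuck" soil moisture readings as they stream in. The valid range and the stuck-run heuristic are invented for illustration; real checks would be calibrated per probe and site.

```python
# Illustrative quality-control pass for streamed soil moisture readings.
VALID_RANGE = (0.0, 0.60)  # plausible volumetric water content, m3/m3

def qc_flags(readings, stuck_run=4):
    """Flag each reading: 'ok', 'range' (outside VALID_RANGE), or 'stuck'
    (part of a run of >= stuck_run identical values, a dead-sensor sign)."""
    flags = []
    run = 1
    for i, v in enumerate(readings):
        run = run + 1 if i > 0 and v == readings[i - 1] else 1
        if not (VALID_RANGE[0] <= v <= VALID_RANGE[1]):
            flags.append("range")
        elif run >= stuck_run:
            flags.append("stuck")
            # retroactively mark the earlier members of the identical run
            for j in range(i - run + 1, i):
                if flags[j] == "ok":
                    flags[j] = "stuck"
        else:
            flags.append("ok")
    return flags

flags = qc_flags([0.21, 0.22, 0.22, 0.22, 0.22, 0.95])
```

Running such checks on an IoT gateway as readings arrive would surface a failed probe within hours instead of at the next site visit.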