Folder: ScienceBase Catalog > Community for Data Integration (CDI)

120 results

USGS research in the Western Geographic Science Center has produced several geospatial datasets estimating the time required to evacuate on foot from a Cascadia subduction zone earthquake-generated tsunami in the U.S. Pacific Northwest. These data, created as a result of research performed under the Risk and Vulnerability to Natural Hazards project, are useful for emergency managers and community planners but are not in the best format to serve their needs. This project explored options for formatting and publishing the data for consumption by external partner agencies and the general public. The project team chose ScienceBase as the publishing platform, both for its ability to convert spatial data into web services...
The USGS 3D Elevation Program (3DEP) is managing the acquisition of lidar data across the Nation for high-resolution mapping of the land surface, useful for multiple applications. Lidar data are initially collected as three-dimensional “point clouds” that map the interaction of the airborne laser with earth surface features, including vegetation, buildings, and ground features. Generally, the products of interest have been high-resolution digital elevation models, generated by filtering the point cloud for laser returns that come from the ground surface and removing returns from vegetation, buildings, powerlines, and other above-ground features. However, there is a wealth of information in the full point cloud on vegetation...
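The ground-filtering step described above can be sketched in a few lines. This is a hypothetical illustration, not the 3DEP production workflow: real pipelines use dedicated tools such as PDAL, and the synthetic arrays below stand in for a decoded lidar point cloud (ASPRS class 2 denotes ground returns).

```python
import numpy as np

ASPRS_GROUND = 2

def points_to_dem(x, y, z, classification, cell=1.0):
    """Average ground-return elevations into a regular grid (NaN = no data)."""
    ground = classification == ASPRS_GROUND
    x, y, z = x[ground], y[ground], z[ground]
    cols = ((x - x.min()) // cell).astype(int)
    rows = ((y - y.min()) // cell).astype(int)
    dem = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    sums = np.zeros_like(dem)
    counts = np.zeros_like(dem)
    np.add.at(sums, (rows, cols), z)      # accumulate elevations per cell
    np.add.at(counts, (rows, cols), 1)    # count returns per cell
    mask = counts > 0
    dem[mask] = sums[mask] / counts[mask]
    return dem

# Tiny synthetic cloud: two ground returns and one vegetation return (class 5).
x = np.array([0.2, 0.8, 0.5])
y = np.array([0.3, 0.6, 0.4])
z = np.array([10.0, 12.0, 25.0])
cls = np.array([2, 2, 5])
dem = points_to_dem(x, y, z, cls)
print(dem)   # the vegetation return is excluded; the two ground hits are averaged
```

Filtering by classification code and gridding the survivors is the essence of the "bare-earth" DEM products mentioned above; the full point cloud retains the vegetation and building returns this sketch discards.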
USGS will soon transition to the international metadata standards known collectively as ISO 19115. The open-ended nature of ISO offers much greater flexibility and a richer vocabulary for describing research products. However, that flexibility means there are few constraints to guide authors and ensure standardized, robust documentation across the bureau. This project proposed that the USGS data community develop content specifications defining standard USGS ISO metadata content requirements. These specifications would be modular in order to meet the documentation needs of the diverse range of research data that USGS produces. Using the specifications, metadata authors will be guided to include appropriate metadata fields for...
People near an earthquake publish anecdotal information about the shaking within seconds of its occurrence via social network technologies such as Twitter. In contrast, depending on the size and location of the earthquake, scientific alerts can take from two to twenty minutes to publish. The goal of this project is to assess earthquake damage and effects information as impacts unfold, leveraging this expeditious, free, and ubiquitous social-media data to enhance our response. Principal Investigator : Michelle Guy, Paul S Earle Cooperator/Partner : Scott R Horvath, Douglas Bausch, Gregory M Smoczyk The project leverages an existing system that performs...
The purpose of this project is to improve the USGS Publications Warehouse (Pubs Warehouse) so that a person can search for USGS publications by geographic region in addition to existing search criteria; for example, one could search using map zooms or congressional districts. The addition of geographic searches allows users to narrow their search results to specific areas of interest, which reduces the time required to sift through all results outside the area of interest. Principal Investigator : Rex Sanders Cooperator/Partner : Qi Tong, Jenna E Nolt, James M Kreft, Frances L Lightsom, Carolyn Hayashida Degnan, Jennifer L Bruce, Ben Wheeler In FY 2014, the project team determined that the ScienceBase Footprint...
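The geographic-search idea above reduces to testing each publication's spatial footprint against a user-drawn region. The sketch below shows the core bounding-box intersection test; the record structure and field names are hypothetical, not the actual Pubs Warehouse schema.

```python
def bbox_intersects(a, b):
    """Each box is (min_lon, min_lat, max_lon, max_lat)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def search_by_region(publications, query_bbox):
    """Keep only publications whose footprint overlaps the query box."""
    return [p for p in publications if bbox_intersects(p["footprint"], query_bbox)]

pubs = [
    {"title": "Coastal report", "footprint": (-124.5, 46.0, -123.5, 47.0)},
    {"title": "Desert study",   "footprint": (-116.0, 33.0, -115.0, 34.0)},
]
hits = search_by_region(pubs, (-125.0, 45.5, -123.0, 47.5))
print([p["title"] for p in hits])   # → ['Coastal report']
```

In practice, searches by congressional district or map zoom would resolve the named region to such a box (or polygon) first, then run the same spatial filter.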
The United States has over 2 million dams on rivers and streams (Graf, 1999), and more than 84,000 of the larger dams are documented in the congressionally mandated National Inventory of Dams (U.S. Army Corps of Engineers, 2015). The average age of the dams in the National Inventory is 52 years; by the year 2030, over 80 percent will be at least 50 years old (American Society of Civil Engineers, 2015). As a result of this aging infrastructure, dam removal has increased during recent decades, with the total number of removed dams estimated at around 1,200 (American Rivers, 2014). Many factors drive downstream physical and biological responses following dam removal, with most rivers changing rapidly and demonstrating...
Increasing attention is being paid to the importance of proper scientific data management and implementing processes that ensure that products being released are properly documented. USGS policies have been established to properly document not only publications, but also the related data and software. This relatively recent expansion of documentation requirements for data and software may present a daunting challenge for many USGS scientists whose major focus is their physical science and who have less expertise in information science. As a proof of concept, this project has created a software solution that facilitates this process through a user-friendly, but comprehensive, interface embedded in an existing...
The national Nonindigenous Aquatic Species (NAS) Database Program serves as a repository for geo-referenced occurrence data on introduced aquatic organisms across the nation. The NAS Program, including the database and website (http://nas.er.usgs.gov), is a well-known resource and has been widely referenced in peer-reviewed literature, agency reports, state and national management plans, news articles, and other locations. Access to NAS occurrence data is currently restricted to three channels: directly through the NAS website (primarily single occurrence records and species distribution maps), distributed databases (e.g., GBIF, BISON), and custom data queries directly through NAS program staff. The goal of this project...
A BioBlitz is a field survey method for finding and documenting as many species as possible in a specific area over a short period. The National Park Service and National Geographic Society hosted the largest BioBlitz survey ever in 2016; people in more than 120 national parks used the iNaturalist app on mobile devices to document organisms they observed. Resulting records have Global Positioning System (GPS) coordinates, include biological accuracy assessments, and provide an unprecedented snapshot of biodiversity nationwide. Additional processing and analysis would make these data available to inform conservation and management decisions. This project developed a process to integrate iNaturalist data with existing...
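The integration step described above starts by flattening each iNaturalist observation into a row an agency database can ingest. The sketch below only loosely mirrors the layout of the iNaturalist API's observation JSON; the field names are illustrative assumptions, not the project's actual schema.

```python
def normalize_observation(obs):
    """Flatten an iNaturalist-style observation into one tabular row."""
    lat, lon = (float(v) for v in obs["location"].split(","))
    return {
        "species": obs["taxon"]["name"],
        "latitude": lat,
        "longitude": lon,
        "quality": obs["quality_grade"],     # e.g. "research" vs "needs_id"
        "observed_on": obs["observed_on"],
    }

raw = {
    "taxon": {"name": "Sciurus niger"},
    "location": "38.8977,-77.0365",
    "quality_grade": "research",
    "observed_on": "2016-05-20",
}
row = normalize_observation(raw)
print(row["species"], row["quality"])   # → Sciurus niger research
```

Keeping the quality grade alongside the coordinates lets downstream users filter to research-grade records, which matters when the data inform conservation decisions.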
Lower technical and financial barriers have led to a proliferation of lidar point-cloud datasets acquired to support diverse USGS projects. The objective of this effort was to implement an open-source, cloud-based solution through USGS Cloud Hosting Solutions (CHS) that would address the needs of the growing USGS lidar community. We proposed to allow users to upload point-cloud datasets to CHS-administered Amazon Web Services storage where open-source packages Entwine and Potree would provide visualization and manipulation via a local web browser. This functionality for individual datasets would mirror services currently available for USGS 3DEP data. After the software packages could not satisfy internal technical...
We aim to migrate our research workflow from a closed system to an open framework, increasing flexibility and transparency in our science and the accessibility of our data. Our hyperspectral data of agricultural crops are crucial for training/validating machine learning algorithms to study food security, land use, etc. Generating such data is resource-intensive and requires expertise, proprietary software, and specific hardware. We will use CHS resources on the Pangeo JupyterHub to recast our data and workflows into a cloud-agnostic, open-source framework. Lessons learned will be shared at workshops, in reports, and on our website so others can increase the openness and accessibility of their data and workflows....
The USGS maintains an extensive monitoring network throughout the United States in order to protect the public and help manage natural resources. This network generates millions of data points each year, all of which must be evaluated and reviewed manually for quality assurance and control. Sensor malfunctions and issues can result in data losses and unexpected costs, and are typically noticed only during manual data checks, after the fact. By connecting internal USGS databases to “always-on” artificial-intelligence applications, we can constantly scan data streams for issues and predict problems before they occur. By connecting these algorithms to other cloud-hosted services, the system can automatically notify...
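One simple way to realize the "always-on" scanning idea is to flag each new reading that falls far outside a rolling window of recent values. This is a generic anomaly-detection sketch under that assumption, not the project's actual algorithm.

```python
import statistics

def is_anomalous(window, value, threshold=4.0):
    """Flag `value` if it lies more than `threshold` std devs from the window mean."""
    mean = statistics.fmean(window)
    stdev = statistics.stdev(window)
    if stdev == 0:
        return value != mean          # flat window: any change is suspicious
    return abs(value - mean) / stdev > threshold

# Recent gage readings from a hypothetical sensor.
recent = [10.1, 10.3, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0]
print(is_anomalous(recent, 10.2))   # → False (normal reading)
print(is_anomalous(recent, 55.0))   # → True  (likely sensor fault)
```

A production system would run such a check on every incoming value and, as the abstract notes, wire the positive detections to a cloud-hosted notification service rather than a print statement.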
Understanding and anticipating change in dynamic Earth systems is vital for societal adaptation and welfare. USGS possesses the multidisciplinary capabilities to anticipate Earth systems change, yet our work is often bound within a single discipline and/or Mission Area. The proposed work breaks new ground in moving USGS towards an interdisciplinary predictive modeling framework. We are initially leveraging three research elements that cross the Land Resources and Water Mission Areas in an attempt to “close the loop” in modeling interactions among water, land use, and climate. Using the Delaware River Basin as a proof-of-concept, we are modeling 1) historical and future landscapes (~1850 to 2100), 2) evapotranspiration...
Identifying the leading edge of a biological invasion can be difficult. Many management and research entities have biological samples or surveys that may unknowingly contain data on nonindigenous species. The new Nonindigenous Aquatic Species (NAS) Database automated online tool “SEINeD” (Screen and Evaluate Invasive and Non-native Data) will allow a user to search for these nonindigenous occurrences at the push of a button. This new tool will enable stakeholders to upload a biological dataset of fishes, invertebrates, amphibians, reptiles, or aquatic plants collected anywhere in a U.S. State or Territory and screen that data for non-native aquatic species occurrences. In addition to checking for the nativity of species...
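The screening logic such a tool automates amounts to comparing each uploaded occurrence against a lookup of native ranges and flagging records outside them. The range table below is a made-up stand-in for the NAS Database, purely to illustrate the shape of the check.

```python
# Illustrative native-range lookup: species -> set of states where it is native.
NATIVE_RANGE = {
    "Micropterus salmoides": {"FL", "GA", "AL"},   # illustrative only
    "Channa argus": set(),                          # non-native everywhere in the U.S.
}

def screen(records):
    """Return (species, state) pairs found outside the species' native range."""
    flagged = []
    for species, state in records:
        native_states = NATIVE_RANGE.get(species)
        if native_states is not None and state not in native_states:
            flagged.append((species, state))
    return flagged

uploads = [
    ("Micropterus salmoides", "FL"),   # within (illustrative) native range
    ("Micropterus salmoides", "OR"),   # outside native range
    ("Channa argus", "MD"),            # nonindigenous occurrence
]
print(screen(uploads))
# → [('Micropterus salmoides', 'OR'), ('Channa argus', 'MD')]
```

Species absent from the lookup pass through unflagged here; a real screening service would also report unmatched names so they can be resolved against an authoritative taxonomy.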
Geochronological data provide essential information for understanding the timing of geologic processes and events, as well as for quantifying rates and timescales key to geologic mapping and to mineral-resource, energy-resource, and hazard assessments. The USGS’s National Geochronological Database (NGDB) contains over 30,000 radiometric ages, but no formal update has occurred in over 20 years. This project is developing a database with a web-based user interface and sustainable workflow to host all USGS-generated geochronological data. This new geochronological database consists of (1) data from the existing NGDB; (2) published literature data generated by the USGS; and (3) more recent data extracted from ScienceBase...
Natural resources managers are regularly required to make decisions regarding upcoming restoration treatments, often based on little more than business-as-usual practices. To assist in the decision-making process, we created a tool that predicts site-specific soil moisture and climate for the upcoming year and provides guidance on whether common restoration activities (e.g., seeding, planting) will be successful under these conditions. This tool is hosted within the Land Treatment Exploration Tool (LTET), an application already used by land managers that delivers a report of site condition and treatment history. Incorporated within the short-term drought forecaster (STDF) is a rigorous statistical process,...
Digital Elevation Models (DEMs) provide details of the earth’s surface and are used for visualization, physical modeling, and elevation change analysis. Creating DEMs in coastal environments is complicated by the highly dynamic nature of the coast and the need to span the land-water interface. This requires merging multiple bathymetric and topographic datasets that have been collected at different times, using different instrument platforms with varying levels of accuracy, and with variable spatial resolution and coverage. Because coastal change can occur over relatively short time scales (days to weeks in the case of storms), rapid updates to coastal DEMs are also needed. These challenges and the lack of available...
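The merge step described above can be sketched as filling data gaps in the highest-priority grid (e.g., the newest or most accurate survey) from successively lower-priority grids. This toy example assumes all inputs have already been resampled to a common grid, which a real workflow would do first.

```python
import numpy as np

def merge_dems(grids):
    """Merge same-shape elevation grids, highest priority first; NaN = no data."""
    merged = np.full_like(grids[0], np.nan, dtype=float)
    for grid in grids:
        gaps = np.isnan(merged)       # cells not yet filled by a better source
        merged[gaps] = grid[gaps]
    return merged

nan = np.nan
recent_lidar = np.array([[1.0, nan], [nan, 4.0]])   # newest survey, partial coverage
old_bathy    = np.array([[9.0, 2.0], [3.0, 9.0]])   # older survey, full coverage
print(merge_dems([recent_lidar, old_bathy]))
# → [[1. 2.]
#    [3. 4.]]
```

The priority ordering is where the "collected at different times, with varying accuracy" problem shows up: ranking sources by date and vertical accuracy before merging is the substance of the workflow, and the one-line fill rule simply enforces that ranking.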
Increasingly, USGS scientists seek to share and collaborate while working on data and code, and often require advanced computing resources. Jupyter Notebooks are one such tool for creating these workflows: interactive “notebook” files that combine code and text in one document, enabling scientists to share the stories held within their data. Recently, USGS launched an instance of Pangeo (a community platform for Big Data geoscience) as a tool for internally hosting and executing these notebooks. Few examples and no formal documentation exist to guide USGS scientists in using Pangeo. We will create and curate examples of using Jupyter Notebooks...
Metadata Wizard. THIS VERSION OF THE TOOL HAS BEEN REPLACED BY AN UPDATED VERSION. Users should obtain the new version of the Metadata Wizard at the links below. User manual: https://usgs.github.io/fort-pymdwizard/index.html Software installer: https://github.com/usgs/fort-pymdwizard/releases The new tool will eventually replace the Metadata Wizard hosted from this page, eliminating dependencies on ESRI ArcDesktop and enabling Mac users to utilize the Metadata Wizard. Documentation and previous release notes for the legacy publication and product are below. Metadata Wizard version: 1.8.5 (Last updated: 1/21/20) To download this toolbox...
The USGS provides many national, regional, and local datasets for download, for streaming interaction such as WFS/WCS, and for analysis. Ultimately, most datasets are presented for visualization in "viewers" offering basic navigation and interaction for inspection, and even lightweight WebGIS-like functions such as web services and annotations. USGS, DOI, and the Federal Government as a whole have invested in many viewers with different APIs, clients, purposes, and niche functions. The solution is not "1 viewer" or "1 viewer API" (see the "Viewer Explosion Conundrum" below). We are stuck in a multiple-viewer environment; at best, we could recommend a few APIs and restrict others. The problem with this is that when someone goes to a new viewer,...