
Filters: Tags: CMS Themes

Folders: ROOT > ScienceBase Catalog > Community for Data Integration (CDI)

120 results

The National Land Cover Database (NLCD) serves as the definitive Landsat-based, 30-meter resolution, land cover database for the Nation. NLCD supports a wide variety of Federal, State, local, and nongovernmental applications that seek to assess ecosystem status and health, understand the spatial patterns of biodiversity, predict effects of climate change, and develop land management policy. However, access to NLCD products for the USGS community and the public is a concern due to large file sizes, limited download options, and the expectation that users must download and analyze multiple land cover products in order to answer even basic land cover change questions. Therefore, the goal of the NLCD Evaluation, Visualization...
Note 9/22/18: The Adopt a Pixel concept has been incorporated into NASA's Globe Observer App (Land Cover Tool). Find out more and download the app at https://observer.globe.gov/. *** Adopt a Pixel-Data Infrastructure (AaP-DI) provides the basis for a new data acquisition system for ground reference data. These data will be used to complement existing and future remote sensing collections by providing geospatially tagged ground-based landscape imagery and land cover of an exact location from 6 different viewing aspects. The goal is for AaP-DI to enable citizen participation in Landsat science. Principal Investigator : Ryan Longhenry, Eric C Wood Cooperator/Partner : Jeannie Allen, Virginia Butcher, Rachel Headley,...
ScienceCache is a scientific geocaching mobile application framework that targets two user groups for citizen science data collection: youth and geocachers. By melding training and games into the hunt for place-based data collection sites and incorporating photo uploads as data and authentication, new volunteers can collaborate in robust data collection. Scientists build a project on the administrative web application, specifying locations or goals for new data collection sites, clues for established sites, questions to answer, measurements, or other activities for the site based on their individual data needs. The project builds on the success of the USA National Phenology Network (NPN) and the ScienceBase project...
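The abstract does not specify how a project is encoded; purely as illustration, a ScienceCache-style project definition built on the administrative side might resemble the structure below. All field names and values are hypothetical.

    # Purely illustrative sketch of a geocaching-style data collection project;
    # every field name and value here is hypothetical.
    project = {
        "name": "Alpine wildflower phenology hunt",
        "sites": [
            {
                "id": "site-01",
                "location": {"lat": 48.76, "lon": -113.78},
                "clue": "Follow the trail to the second switchback.",
                "questions": [
                    {"prompt": "Are flowers open at this site?", "type": "yes_no"},
                    {"prompt": "Estimate the percent of plants flowering.", "type": "number"},
                ],
                "require_photo": True,  # photo uploads serve as data and authentication
            }
        ],
    }
    print(len(project["sites"]), "collection site(s) defined")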
Geotagged photographs have become a useful medium for recording, analyzing, and communicating Earth science phenomena. Despite their utility, many field photographs are not published or preserved in a spatial or accessible format—oftentimes because of confusion about photograph metadata, a lack of stability, or user customization in free photo sharing platforms. After receiving a request to release about 1,210 geotagged geological field photographs of the Grand Canyon region, we set out to publish and preserve the collection in the most robust (and expedient) manner possible (fig. 6). We leveraged and reworked existing metadata, JavaScript, and Python tools and developed a toolkit and proposed workflow to display...
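The published workflow leveraged existing metadata and Python tools; as a minimal sketch of one step in that kind of pipeline, the snippet below reads GPS coordinates from a JPEG's EXIF block with the Pillow library. The file name is a hypothetical placeholder.

    # Sketch: extract latitude/longitude from a geotagged JPEG's EXIF metadata.
    # Assumes Pillow is installed; "photo_0001.jpg" is a hypothetical file name.
    from PIL import Image
    from PIL.ExifTags import TAGS, GPSTAGS

    def to_degrees(value):
        """Convert EXIF degrees/minutes/seconds rationals to decimal degrees."""
        d, m, s = (float(x) for x in value)
        return d + m / 60.0 + s / 3600.0

    def read_lat_lon(path):
        raw = Image.open(path)._getexif() or {}
        exif = {TAGS.get(k, k): v for k, v in raw.items()}
        gps = {GPSTAGS.get(k, k): v for k, v in exif.get("GPSInfo", {}).items()}
        if "GPSLatitude" not in gps or "GPSLongitude" not in gps:
            return None
        lat = to_degrees(gps["GPSLatitude"])
        lon = to_degrees(gps["GPSLongitude"])
        if gps.get("GPSLatitudeRef") == "S":
            lat = -lat
        if gps.get("GPSLongitudeRef") == "W":
            lon = -lon
        return lat, lon

    print(read_lat_lon("photo_0001.jpg"))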
This project developed a set of raster utility classes and layer types for inclusion in OpenLayers to allow for statistical analysis, manipulation, and additional rendering functionality for raster data sources. The deliverables are patches for the OpenLayers development branch that include the new functionality, examples and documentation to demonstrate its use, and comprehensive unit test coverage. The intention was to get this newly developed functionality into the next stable release of OpenLayers. An additional component is an HTML5 toolkit for the open-source JavaScript mapping framework OpenLayers. These tools are especially useful for USGS web mapping needs. This effort delivered a new set of classes within...
The purpose of this project was to establish and support a USGS Mobile Environment website to provide support for portable hardware devices, application development, and application delivery. The development of a framework to fully support this endeavor will require input and involvement by Core Science Systems, Enterprise Information, Science Quality and Integrity, the Office of Communications and Publishing, and the mobile community. Principal Investigator : Lorna A Schmid, David L Govoni, Sky Bristol, Tim Kern Benefits : One-stop shop to provide detailed support information across USGS Mission Areas; actual functioning mobile applications, built collectively. Deliverables : Trained Mobile Community Workshop held July 17...
Detailed information about past fire history is critical for understanding fire impacts and risk, as well as prioritizing conservation and fire management actions. Yet, fire history information is neither consistently nor routinely tracked by many agencies and states, especially on private lands in the Southeast. Remote sensing data products offer opportunities to do so but require additional processing to condense and facilitate their use by land managers. Here, we propose to generate fire history metrics from the Landsat Burned Area Products for the southeastern US. We will develop code for a processing pipeline that utilizes USGS high-performance computing resources, evaluate Amazon cloud computing services,...
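As a sketch of the kind of per-pixel metric such a pipeline could compute, the snippet below derives burn counts and the most recent burn year from a stack of annual burned-area grids. The arrays and year range are hypothetical stand-ins for the actual Landsat Burned Area Products.

    # Sketch: simple fire history metrics from annual burned-area grids.
    # Each grid is a boolean array (True = pixel burned that year); the stack
    # and year range are hypothetical placeholders for real products.
    import numpy as np

    years = np.arange(1984, 2021)                         # hypothetical year range
    stack = np.random.rand(len(years), 100, 100) > 0.97   # stand-in for real grids

    burn_count = stack.sum(axis=0)                        # times each pixel burned

    # Year of the most recent burn (0 where a pixel never burned).
    last_index = np.where(stack.any(axis=0),
                          stack.shape[0] - 1 - np.argmax(stack[::-1], axis=0),
                          -1)
    last_burn_year = np.where(last_index >= 0, years[last_index], 0)

    print("maximum burn count:", burn_count.max())
    print("most recent burn year at pixel (0, 0):", last_burn_year[0, 0])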
The Total Water Level and Coastal Change Forecast delivers 6-day forecasts of hourly water levels and the probability of waves impacting dunes along 5,000 km of sandy coast on the Atlantic and Gulf of Mexico, and will soon expand to the Pacific. These forecasts provide needed information to local governments and federal partners and are used by the USGS to place sensors before a storm. The forecast data are presented in a publicly accessible web tool and stored in a database. Currently, model data are only accessible to project staff. A growing user community is requesting direct access to the data to conduct scientific analyses and share forecasts on other platforms. To address this need, we will develop an...
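A sketch of the kind of programmatic access the project aims to provide is shown below, assuming a REST endpoint that returns hourly total water level forecasts as JSON. The URL, parameters, and field names are hypothetical.

    # Sketch: request hourly total water level forecasts and flag hours where the
    # forecast reaches a dune toe elevation. Endpoint and fields are hypothetical.
    import requests

    BASE_URL = "https://example.usgs.gov/twl/api/forecast"   # hypothetical endpoint
    params = {"site": "NC_0123", "hours": 144}               # hypothetical site id

    response = requests.get(BASE_URL, params=params, timeout=30)
    response.raise_for_status()
    forecast = response.json()                               # list of hourly records

    dune_toe_m = 2.1                                         # hypothetical elevation
    impacts = [rec for rec in forecast
               if rec["total_water_level_m"] >= dune_toe_m]
    print(f"{len(impacts)} of {len(forecast)} forecast hours reach the dune toe")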
Artificial Intelligence (AI) is revolutionizing ecology and conservation by enabling species recognition from photos and videos. Our project evaluates the capacity to expand AI to individual fish recognition for population assessment. The success of this effort would facilitate fisheries analysis at an unprecedented scale by engaging anglers and citizen scientists in imagery collection. This project is one of the first attempts to apply AI toward fish population assessment with citizen science. Principal Investigator : Nathaniel P Hitt Co-Investigator : Natalya I Rapstine, Mona (Contractor) Arami, Jeff T Falgout, Benjamin Letcher, Nicholas Polys Cooperator/Partner : Sophia Liu, Fraser Hayes, Ky Wildermuth, Bryan...
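The abstract does not describe the model itself; the snippet below is a conceptual sketch of one common approach to individual recognition, matching an image's feature vector (embedding) against a gallery of known individuals by cosine similarity. The embeddings here are random placeholders; in practice they would come from a trained network.

    # Conceptual sketch: match a query embedding to a gallery of known individuals
    # by cosine similarity. Embeddings are random placeholders, not model output.
    import numpy as np

    rng = np.random.default_rng(0)
    gallery = {f"fish_{i:03d}": rng.normal(size=128) for i in range(50)}
    query = rng.normal(size=128)

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    best_id, best_score = max(
        ((fid, cosine(query, emb)) for fid, emb in gallery.items()),
        key=lambda pair: pair[1],
    )
    print(f"best match: {best_id} (cosine similarity {best_score:.2f})")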
The Community for Data Integration (CDI) Risk Map Project is developing modular tools and services to benefit a wide group of scientists and managers that deal with various aspects of risk research and planning. Risk is the potential that exposure to a hazard will lead to a negative consequence to an asset such as human or natural resources. This project builds upon a Department of the Interior project that is developing geospatial layers and other analytical results that visualize multi-hazard exposure to various DOI assets. The CDI Risk Map team has developed the following: a spatial database of hazards and assets, an API (application programming interface) to query the data, web services with Geoserver (an open-source...
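Because the project exposes its hazard and asset layers through GeoServer, a client can retrieve features with a standard OGC WFS GetFeature request; a minimal sketch follows. The host, workspace, layer name, and bounding box are hypothetical.

    # Sketch: query a GeoServer WFS endpoint for hazard features as GeoJSON.
    # The host, workspace, and layer name are hypothetical placeholders.
    import requests

    WFS_URL = "https://example.usgs.gov/geoserver/wfs"       # hypothetical host
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": "riskmap:wildfire_hazard",              # hypothetical layer
        "outputFormat": "application/json",
        "bbox": "-105.5,39.5,-104.5,40.5,EPSG:4326",         # area of interest
    }

    resp = requests.get(WFS_URL, params=params, timeout=60)
    resp.raise_for_status()
    features = resp.json()["features"]
    print(f"returned {len(features)} hazard features")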
Executive Summary Traditionally in the USGS, data is processed and analyzed on local researcher computers, then moved to centralized, remote computers for preservation and publishing (ScienceBase, Pubs Warehouse). This approach requires each researcher to have the necessary hardware and software for processing and analysis, and also to bring all external data required for the workflow over the internet to their local computer. To explore a more efficient and effective scientific workflow, we tested an alternate model: storing scientific data remotely, and performing data analysis and visualization close to the data, using only a local web browser as an interface. Although this environment was not a good fit...
Land-use researchers need the ability to rapidly compare multiple land-use scenarios over a range of spatial and temporal scales, and to visualize spatial and nonspatial data; however, land-use datasets are often distributed in the form of large tabular files and spatial files. These formats are not ideal for the way land-use researchers interact with and share these datasets. These land-use datasets can also quickly balloon in size. For example, land-use simulations for the Pacific Northwest, at 1-kilometer resolution, across 20 Monte Carlo realizations, can produce over 17,000 tabular and spatial outputs. A more robust management strategy is to store scenario-based, land-use datasets within a generalized...
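One way to realize a generalized store for scenario-based outputs is a relational layout keyed by scenario, Monte Carlo realization, year, and variable; the sketch below uses SQLite with hypothetical table and column names, not the project's actual schema.

    # Sketch: a generalized relational layout for scenario-based land-use outputs,
    # keyed by scenario, realization, year, and variable. Names are hypothetical.
    import sqlite3

    con = sqlite3.connect("landuse_scenarios.db")
    con.execute("""
        CREATE TABLE IF NOT EXISTS landuse_output (
            scenario     TEXT    NOT NULL,
            realization  INTEGER NOT NULL,
            year         INTEGER NOT NULL,
            variable     TEXT    NOT NULL,   -- e.g. area of a land-use class
            value        REAL    NOT NULL,
            PRIMARY KEY (scenario, realization, year, variable)
        )
    """)
    con.execute(
        "INSERT OR REPLACE INTO landuse_output VALUES (?, ?, ?, ?, ?)",
        ("baseline", 1, 2050, "cropland_km2", 1234.5),
    )
    rows = con.execute(
        "SELECT year, AVG(value) FROM landuse_output "
        "WHERE scenario = ? AND variable = ? GROUP BY year",
        ("baseline", "cropland_km2"),
    ).fetchall()
    print(rows)
    con.commit()
    con.close()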
USGS scientists often face computationally intensive tasks that require high-throughput computing capabilities. Several USGS facilities use HTCondor to run their computational pools but are not necessarily connected to the larger USGS pool. This project demonstrated how to connect HTCondor pools by flocking, or coordinating, within the USGS. In addition to flocking the Upper Midwest Environmental Science Center and the Wisconsin Water Science Center, we have flocked with the USGS Advanced Research Computing Yeti supercomputing cluster and other water science centers. We also developed tutorials on how to sandbox code using Docker within the USGS environment for use with high-throughput computing. A main accomplishment...
In this age of rapidly developing technology, scientific information is constantly being gathered across large spatial scales. Yet, our ability to coordinate large-scale monitoring efforts depends on development of tools that leverage and integrate multiple sources of data. North American bats are experiencing unparalleled population declines. The North American Bat Monitoring Program (NABat), a multi-national, multi-agency coordinated monitoring program, was developed to better understand the status and trends of North American bats. Similar to other large-scale monitoring programs, the ultimate success of NABat relies on a unified web-based data system. Our project successfully developed a program interface...
Spatial data on landslide occurrence across the U.S. varies greatly in quality, accessibility, and extent. This problem of data variability is common across USGS Mission Areas; it presents an obstacle to developing national-scale products and to identifying areas with relatively good or poor data coverage. We compiled available data on known landslides into a national-scale, searchable online map, which greatly increases public access to landslide hazard information. Additionally, we held a workshop with landslide practitioners and sought broader input from the CDI community; based on those recommendations, we identified a limited subset of essential attributes for inclusion in our product. We also defined a quantitative metric...
Fighting wildfires and reducing their negative effects on natural resources costs billions of dollars annually in the U.S. We will develop the Wildfire Trends Tool (WTT), a data visualization and analysis tool that will calculate and display wildfire trends and patterns for the western U.S. based on user-defined regions of interest, time periods, and ecosystem types. The WTT will be publicly available via a web application that will retrieve fire data and generate graphically compelling maps and charts of fire activity. For an area of interest, users will be able to ask questions such as: Is the area burned by wildfire each year increasing or decreasing over time? Are wildfires becoming larger? Are fire seasons becoming...
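Answering "is the area burned each year increasing over time?" needs little more than a trend fit to annual burned-area totals for the region of interest; a minimal sketch with hypothetical yearly totals follows.

    # Sketch: fit a linear trend to annual burned area for a region of interest.
    # The yearly totals here are synthetic placeholders, not real fire data.
    import numpy as np

    years = np.arange(2000, 2021)
    rng = np.random.default_rng(1)
    burned_km2 = 500 + 12.0 * (years - years[0]) + rng.normal(0, 60, years.size)

    slope, intercept = np.polyfit(years, burned_km2, deg=1)
    direction = "increasing" if slope > 0 else "decreasing"
    print(f"burned area is {direction} by about {slope:.1f} km^2 per year")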
A number of monitoring method and protocol libraries already exist. Although these systems have been tailored to certain disciplines or research foci, the underlying principles, mechanisms, and processes have commonalities that could facilitate synthesizing content and information. The Protocol Library project modifies and thus extends the capabilities of the existing NEMI methods compendium. To incorporate a broad array of protocols, NEMI developers have conducted user requirement queries of protocol owners. In addition, protocol owners have been asked how they would prefer to access the data. Input forms have been created to accommodate the desires of protocol...
Wetland soils are vital to the Nation because of their role in sustaining water resources, supporting critical ecosystems, and sequestering significant concentrations of biologically-produced carbon. The United States has the world’s most detailed continent-scale digital datasets for soils and wetlands, yet scientists and land managers have long struggled with the challenge of integrating these datasets for applications in research and in resource assessment and management. The difficulties include spatial and temporal uncertainties, inconsistencies among data sources, and inherent structural complexities of the datasets. This project’s objective was to develop and document a set of methods to impute wetland...
Large amounts of data are being generated that require hours, days, or even weeks to analyze using traditional computing resources. Innovative solutions must be implemented to analyze the data in a reasonable timeframe. The program HTCondor (https://research.cs.wisc.edu/htcondor/) takes advantage of the processing capacity of individual desktop computers and dedicated computing resources as a single, unified pool. This unified pool of computing resources allows HTCondor to quickly process large amounts of data by breaking the data into smaller tasks distributed across many computers. This project team implemented HTCondor at the USGS Upper Midwest Environmental Sciences Center (UMESC) to leverage existing computing...
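The high-throughput pattern the abstract describes amounts to splitting a large analysis into many small jobs handed to an HTCondor pool; a minimal sketch follows, writing a submit description and passing it to condor_submit. The script name and file paths are hypothetical.

    # Sketch: queue one HTCondor job per input chunk via a submit description.
    # The worker script and directory layout are hypothetical placeholders.
    import subprocess
    from pathlib import Path

    submit_text = """\
    executable = process_chunk.sh
    arguments  = $(chunk)
    output     = logs/$(chunk).out
    error      = logs/$(chunk).err
    log        = logs/htcondor.log
    queue chunk matching files chunks/*.csv
    """

    Path("logs").mkdir(exist_ok=True)
    Path("job.sub").write_text(submit_text)
    subprocess.run(["condor_submit", "job.sub"], check=True)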
This project will assess the accuracy of climate drivers (precipitation and temperature) from different sources for current and future conditions. The impact of these drivers on hydrologic response will be evaluated using the monthly water balance model (MWBM). The methodology for processing and analyzing these datasets will be automated so that it can be rerun when new climate datasets become available on the USGS Geo Data Portal (http://cida.usgs.gov/climate/gdp/ - content no longer available). This will ensure continued relevancy of project results, future opportunities for research and assessment of potential climate change impacts on hydrologic resources, and comparison between generations of climate data. To share and distribute the...
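The MWBM itself is documented elsewhere; the sketch below only illustrates the general monthly water balance idea (soil moisture accounting driven by precipitation and potential evapotranspiration) with simplified logic and hypothetical inputs, not the actual USGS model.

    # Sketch: a highly simplified monthly water balance (not the actual USGS MWBM).
    # Precipitation and PET inputs are hypothetical; soil storage in millimeters.
    def monthly_water_balance(precip_mm, pet_mm, capacity_mm=150.0):
        storage, runoff = capacity_mm, []
        for p, pet in zip(precip_mm, pet_mm):
            available = storage + p
            aet = min(pet, available)          # actual ET limited by available water
            remaining = available - aet
            storage = min(remaining, capacity_mm)
            runoff.append(max(remaining - capacity_mm, 0.0))
        return runoff

    precip = [80, 70, 60, 50, 40, 20, 10, 15, 30, 55, 70, 85]   # hypothetical mm/month
    pet    = [10, 15, 30, 50, 80, 110, 120, 105, 70, 40, 20, 10]
    print([round(q, 1) for q in monthly_water_balance(precip, pet)])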