Tags: Computational Tools and Services

96 results

The Total Water Level and Coastal Change Forecast delivers 6-day forecasts of hourly water levels and the probability of waves impacting dunes along 5,000 km of sandy Atlantic and Gulf of Mexico coastline, and will soon expand to the Pacific. These forecasts provide needed information to local governments and federal partners and are used by the USGS to place sensors before a storm. The forecast data are presented in a publicly accessible web tool and stored in a database. Currently, model data are accessible only to project staff. A growing user community is requesting direct access to the data to conduct scientific analyses and to share forecasts on other platforms. To address this need, we will develop an...
Artificial Intelligence (AI) is revolutionizing ecology and conservation by enabling species recognition from photos and videos. Our project evaluates the potential to extend AI to individual fish recognition for population assessment. The success of this effort would facilitate fisheries analysis at an unprecedented scale by engaging anglers and citizen scientists in imagery collection. This project is one of the first attempts to apply AI to fish population assessment with citizen science. Principal Investigator: Nathaniel P Hitt Co-Investigator: Natalya I Rapstine, Mona (Contractor) Arami, Jeff T Falgout, Benjamin Letcher, Nicholas Polys Cooperator/Partner: Sophia Liu, Fraser Hayes, Ky Wildermuth, Bryan...
The Community for Data Integration (CDI) Risk Map Project is developing modular tools and services to benefit a wide group of scientists and managers who deal with various aspects of risk research and planning. Risk is the potential that exposure to a hazard will lead to a negative consequence for an asset such as human or natural resources. This project builds upon a Department of the Interior project that is developing geospatial layers and other analytical results that visualize multi-hazard exposure to various DOI assets. The CDI Risk Map team has developed the following: a spatial database of hazards and assets, an API (application programming interface) to query the data, web services with GeoServer (an open-source...
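The abstract describes a spatial database of hazards and assets fronted by a query API. As a minimal sketch of how a client might call such a service, the snippet below uses Python's requests library; the base URL, route, and parameter names are hypothetical placeholders, not the project's actual interface.

```python
# Hypothetical sketch of querying a hazard-exposure API like the one the
# Risk Map project describes. Endpoint and parameter names are placeholders.
import requests

BASE_URL = "https://example.usgs.gov/riskmap/api"  # placeholder endpoint

def get_hazard_exposure(asset_type, hazard, bbox):
    """Request hazard-exposure records for one asset type inside a bounding box."""
    params = {
        "asset": asset_type,                # e.g. "doi_facilities"
        "hazard": hazard,                   # e.g. "wildfire"
        "bbox": ",".join(map(str, bbox)),   # minx,miny,maxx,maxy
        "format": "json",
    }
    resp = requests.get(f"{BASE_URL}/exposure", params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    records = get_hazard_exposure("doi_facilities", "wildfire",
                                  (-111.0, 40.0, -104.0, 45.0))
    print(f"{len(records.get('features', []))} exposed assets returned")
```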
Executive Summary: Traditionally in the USGS, data are processed and analyzed on local researcher computers, then moved to centralized, remote computers for preservation and publishing (ScienceBase, Pubs Warehouse). This approach requires each researcher to have the necessary hardware and software for processing and analysis, and also to bring all external data required for the workflow over the internet to their local computer. In search of a more efficient and effective scientific workflow, we explored an alternative model: storing scientific data remotely, and performing data analysis and visualization close to the data, using only a local web browser as an interface. Although this environment was not a good fit...
Land-use researchers need the ability to rapidly compare multiple land-use scenarios over a range of spatial and temporal scales, and to visualize spatial and nonspatial data; however, land-use datasets are often distributed as large tabular and spatial files. These formats are not ideal for the way land-use researchers interact with and share these datasets, and they can balloon in size quickly. For example, land-use simulations for the Pacific Northwest, at 1-kilometer resolution, across 20 Monte Carlo realizations, can produce over 17,000 tabular and spatial outputs. A more robust management strategy is to store scenario-based, land-use datasets within a generalized...
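To illustrate the database-centric strategy the abstract argues for, here is a minimal sketch that stores scenario summaries in SQLite instead of thousands of loose files; the schema, table, and column names are invented for illustration and are not the project's actual design.

```python
# Minimal sketch: store scenario-based land-use summaries in a relational
# database rather than many loose tabular files. Schema is illustrative only.
import sqlite3

conn = sqlite3.connect("landuse_scenarios.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS landuse_area (
        scenario     TEXT,     -- e.g. 'A1B_high_growth'
        realization  INTEGER,  -- Monte Carlo realization number
        year         INTEGER,
        lu_class     TEXT,     -- e.g. 'forest', 'cropland'
        area_km2     REAL
    )
""")

# Insert one realization's summary rows (values are made up).
rows = [
    ("A1B_high_growth", 1, 2050, "forest",   12345.6),
    ("A1B_high_growth", 1, 2050, "cropland",  8910.2),
]
conn.executemany("INSERT INTO landuse_area VALUES (?, ?, ?, ?, ?)", rows)
conn.commit()

# Compare scenarios with a single query instead of opening many files.
for scenario, mean_forest in conn.execute(
    "SELECT scenario, AVG(area_km2) FROM landuse_area "
    "WHERE lu_class = 'forest' AND year = 2050 GROUP BY scenario"
):
    print(scenario, mean_forest)
conn.close()
```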
USGS scientists often face computationally intensive tasks that require high-throughput computing capabilities. Several USGS facilities use HTCondor to run their computational pools but are not necessarily connected to the larger USGS pool. This project demonstrated how to connect HTCondor pools within the USGS by flocking, or coordinating, them. In addition to flocking the Upper Midwest Environmental Sciences Center and the Wisconsin Water Science Center, we have flocked with the USGS Advanced Research Computing Yeti supercomputing cluster and other water science centers. We also developed tutorials on how to sandbox code using Docker within the USGS environment for use with high-throughput computing. A main accomplishment...
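As a minimal sketch of the Docker-sandboxing idea mentioned above, the snippet below submits a Docker-universe job with recent versions of the htcondor Python bindings. The container image and analysis script are placeholders, and flocking itself is configured by pool administrators in the HTCondor configuration files rather than in this script.

```python
# Minimal sketch: submit a Docker-sandboxed job to an HTCondor pool using the
# htcondor Python bindings (pip install htcondor). Image and script names are
# placeholders; pool/flocking setup lives in condor_config, not here.
import htcondor

submit_description = htcondor.Submit({
    "universe":             "docker",
    "docker_image":         "python:3.11-slim",        # placeholder image
    "executable":           "analyze.py",              # hypothetical script
    "arguments":            "--input chunk_$(Process).csv",
    "transfer_input_files": "chunk_$(Process).csv",
    "output":               "job_$(Process).out",
    "error":                "job_$(Process).err",
    "log":                  "analysis.log",
    "request_cpus":         "1",
    "request_memory":       "2GB",
})

schedd = htcondor.Schedd()                              # local submit node
result = schedd.submit(submit_description, count=10)    # queue 10 tasks
print("Submitted cluster", result.cluster())
```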
In this age of rapidly developing technology, scientific information is constantly being gathered across large spatial scales. Yet our ability to coordinate large-scale monitoring efforts depends on the development of tools that leverage and integrate multiple sources of data. North American bats are experiencing unparalleled population declines. The North American Bat Monitoring Program (NABat), a multi-national, multi-agency coordinated monitoring program, was developed to better understand the status and trends of North American bats. Similar to other large-scale monitoring programs, the ultimate success of NABat relies on a unified web-based data system. Our project successfully developed a program interface...
Spatial data on landslide occurrence across the U.S. vary greatly in quality, accessibility, and extent. This problem of data variability is common across USGS Mission Areas; it presents an obstacle to developing national-scale products and to identifying areas with relatively good or poor data coverage. We compiled available data on known landslides into a national-scale, searchable online map, which greatly increases public access to landslide hazard information. Additionally, we held a workshop with landslide practitioners and sought broader input from the CDI community; based on those recommendations, we identified a limited subset of essential attributes for inclusion in our product. We also defined a quantitative metric...
Fighting wildfires and reducing their negative effects on natural resources costs billions of dollars annually in the U.S. We will develop the Wildfire Trends Tool (WTT), a data visualization and analysis tool that will calculate and display wildfire trends and patterns for the western U.S. based on user-defined regions of interest, time periods, and ecosystem types. The WTT will be publicly available via a web application that will retrieve fire data and generate graphically compelling maps and charts of fire activity. For an area of interest, users will be able to ask questions such as: Is the area burned by wildfire each year increasing or decreasing over time? Are wildfires becoming larger? Are fire seasons becoming...
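As a minimal sketch of the kind of summary the WTT would compute for a user-selected ecosystem and period (annual area burned plus a simple trend), the snippet below uses pandas on a made-up fire table; the column names and values are illustrative, not the WTT's actual data model.

```python
# Minimal sketch of a WTT-style trend summary: total area burned per year for
# one ecosystem type, plus a linear slope. Input columns and values are
# hypothetical stand-ins for the fire data the tool would retrieve.
import numpy as np
import pandas as pd

fires = pd.DataFrame({
    "year":      [2001, 2001, 2002, 2003, 2003, 2004],
    "ecosystem": ["sagebrush", "forest", "forest", "sagebrush", "forest", "forest"],
    "area_ha":   [1200.0, 300.0, 2500.0, 800.0, 4100.0, 3900.0],
})

def annual_trend(df: pd.DataFrame, ecosystem: str, start: int, end: int):
    """Annual area burned for one ecosystem type, with a linear trend (ha/year)."""
    subset = df[(df["ecosystem"] == ecosystem) & df["year"].between(start, end)]
    annual = subset.groupby("year")["area_ha"].sum()
    slope = np.polyfit(annual.index, annual.values, 1)[0]
    return annual, slope

annual, slope = annual_trend(fires, "forest", 2001, 2004)
print(annual)
print(f"Trend: {slope:+.0f} ha/year")
```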
Large amounts of data are being generated that require hours, days, or even weeks to analyze using traditional computing resources. Innovative solutions must be implemented to analyze the data in a reasonable timeframe. The program HTCondor (https://research.cs.wisc.edu/htcondor/) combines the processing capacity of individual desktop computers and dedicated computing resources into a single, unified pool. This unified pool of computing resources allows HTCondor to quickly process large amounts of data by breaking the data into smaller tasks distributed across many computers. This project team implemented HTCondor at the USGS Upper Midwest Environmental Sciences Center (UMESC) to leverage existing computing...
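The snippet below is a minimal sketch of the "break the data into smaller tasks" pattern described above: split one large input table into chunk files, each of which can then be queued as an independent HTCondor job. The file names and chunk size are illustrative.

```python
# Minimal sketch: split a large CSV into per-task chunks so that a pool like
# HTCondor can process them as many independent jobs (one job per chunk).
import csv
import itertools

CHUNK_ROWS = 100_000   # rows handled by each independent task

def split_into_chunks(src_path: str, chunk_rows: int = CHUNK_ROWS) -> int:
    """Write src_path out as chunk_000.csv, chunk_001.csv, ...; return the count."""
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        for i in itertools.count():
            rows = list(itertools.islice(reader, chunk_rows))
            if not rows:
                return i
            with open(f"chunk_{i:03d}.csv", "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(header)
                writer.writerows(rows)

if __name__ == "__main__":
    n = split_into_chunks("big_observations.csv")   # hypothetical input file
    # Each chunk_XXX.csv then maps to one queued job, e.g. via "queue" in an
    # HTCondor submit file or itemdata in the Python bindings.
    print(f"wrote {n} chunks")
```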
This project will assess the accuracy of climate drivers (precipitation and temperature) from different sources for current and future conditions. The impact of these drivers on hydrologic response will be evaluated using the monthly water balance model (MWBM). The methodology for processing and analysis of these datasets will be automated so that it can be rerun as new climate datasets become available on the USGS Geo Data Portal (http://cida.usgs.gov/climate/gdp/ - content no longer available). This will ensure continued relevancy of project results, future opportunities for research and assessment of potential climate change impacts on hydrologic resources, and comparison between generations of climate data. To share and distribute the...
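To show the kind of calculation a monthly water balance model performs, here is a deliberately simplified bucket-style sketch: precipitation fills a soil-moisture store, evapotranspiration drains it, and the excess becomes runoff. This is not the USGS MWBM; the storage capacity and the input values are illustrative placeholders.

```python
# Highly simplified monthly water-balance sketch (NOT the USGS MWBM).
# Capacity, initial storage, and the monthly inputs are illustrative only.
def simple_water_balance(precip_mm, pet_mm, capacity_mm=150.0):
    """Return monthly runoff (mm) for paired precipitation and PET series."""
    storage = capacity_mm / 2.0          # arbitrary initial soil moisture
    runoff = []
    for p, pet in zip(precip_mm, pet_mm):
        storage += p                                 # add this month's precipitation
        aet = min(pet, storage)                      # actual ET limited by available water
        storage -= aet
        excess = max(0.0, storage - capacity_mm)     # water above field capacity
        storage -= excess
        runoff.append(excess)
    return runoff

# Example: one year of made-up monthly inputs (mm).
precip = [90, 80, 70, 60, 40, 20, 10, 15, 30, 60, 85, 95]
pet    = [10, 15, 30, 50, 80, 110, 130, 120, 80, 45, 20, 10]
print([round(r, 1) for r in simple_water_balance(precip, pet)])
```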
How can the public discover opportunities for participation in USGS scientific research? What citizen science projects are currently active within the USGS? How can principal investigators (PIs) increase public engagement in and awareness of their citizen science projects? To address these questions, a web application leveraging existing Community for Data Integration (CDI) and USGS work was created to allow unprecedented public access to USGS citizen science project metadata and highlights of key science outcomes. Such an application enables, for the first time, high-visibility, unified open access to information about projects and practices related to citizen participation in USGS research. The need for such information was identified...
The purpose of this project was to support the enhanced search, access, and visualization capability for disaster maps and other contributed products on the public USGS Hazards Data Distribution System (HDDS) (U.S. Geological Survey, 2015). These products are often provided to USGS by collaborators for sharing across the response community during the course of an emergency event response; however, in the past, they were not easy for users to discover or access. This project involved the design, testing, and delivery of a new capability for HDDS to ingest, catalog, and display informational or value-added products when provided in a variety of formats. As a result of this work, the user community will be able to...
Science is an increasingly collaborative endeavor. In an era of Web-enabled research, new tools reduce barriers to collaboration across traditional geographic and disciplinary divides and improve the quality and efficiency of science. Collaborative online code management has moved project collaboration from a manual process of email and thumb drives into a traceable, streamlined system where code can move directly from the command line onto the Web for discussion, sharing, and open contributions. Within the USGS, however, data have no analogous system; crucial components are missing for bringing data collaboration and sharing within the USGS to the next level. The sbtools project team built sbtools, an R interface...
We developed an Internet of Things (IoT) prototype and associated cloud infrastructure for camera-based data collection and initial processing of river streamflow in the cloud (fig. 1). This pilot successfully created a hardware and cloud infrastructure to collect and upload video from a camera gage at San Pedro Creek in San Antonio, Texas. Using a ThingLogix Foundry instance in the Amazon Web Services (AWS) Cloud, we have created a cloud framework that can auto-provision new camera-based gaging equipment, as well as process incoming videos into image frames for the computation of streamflow. Additionally, we began testing the serving of time-series data from a camera gage (water level and CPU temperature) using real-time...
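The frame-extraction step described above (turning uploaded video into image frames for streamflow computation) can be sketched with OpenCV; the clip name, output directory, and sampling interval below are placeholders, and the actual pipeline runs in the cloud rather than on a workstation.

```python
# Minimal sketch of the "process incoming videos into image frames" step using
# OpenCV (pip install opencv-python). Paths and the sampling interval are
# placeholders for illustration.
import os
import cv2

def extract_frames(video_path: str, out_dir: str, every_n: int = 30) -> int:
    """Save every Nth frame of a video as a JPEG; return the number saved."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    saved, index = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:                       # end of stream
            break
        if index % every_n == 0:
            cv2.imwrite(os.path.join(out_dir, f"frame_{index:06d}.jpg"), frame)
            saved += 1
        index += 1
    cap.release()
    return saved

if __name__ == "__main__":
    n = extract_frames("san_pedro_creek_clip.mp4", "frames")  # hypothetical clip
    print(f"saved {n} frames")
```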
Access to up-to-date geospatial data is critical when responding to natural hazards-related crises, such as volcanic eruptions. To address the need to reliably provide access to near real-time USGS datasets, we developed a process to allow data managers within the USGS Volcano Hazard Program to programmatically publish geospatial webservices to a cloud-based instance of GeoServer hosted on Amazon Web Services (AWS), using ScienceBase. To accomplish this, we developed a new process in the ScienceBase application, added new functionality to the ScienceBase Python library (sciencebasepy), and assembled a functioning Python workflow demonstrating how users can gather data from a web API and publish these data as a cloud-based...
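The snippet below sketches the shape of the workflow described above: gather data from a web API, then attach it to a ScienceBase item with sciencebasepy. The source URL and parent item ID are placeholders, and the cloud GeoServer publishing step the project added is handled on the ScienceBase side rather than shown here.

```python
# Minimal sketch: fetch data from a web API and upload it to ScienceBase with
# sciencebasepy (pip install sciencebasepy). URL and item ID are placeholders.
import requests
from sciencebasepy import SbSession

SOURCE_URL = "https://example.usgs.gov/volcano/data.geojson"  # placeholder API
PARENT_ITEM_ID = "0123456789abcdef01234567"                   # placeholder ID

def fetch_and_upload():
    # 1. Gather data from a web API and save it locally.
    resp = requests.get(SOURCE_URL, timeout=60)
    resp.raise_for_status()
    local_file = "latest_hazard_data.geojson"
    with open(local_file, "wb") as f:
        f.write(resp.content)

    # 2. Log in to ScienceBase and attach the file to a new child item.
    sb = SbSession()
    sb.loginc("your_usgs_username")      # prompts for a password at the console
    item = sb.create_item({
        "title": "Near real-time hazard layer (demo upload)",
        "parentId": PARENT_ITEM_ID,
    })
    sb.upload_file_to_item(item, local_file)
    sb.logout()

if __name__ == "__main__":
    fetch_and_upload()
```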
The purpose of this project was to integrate the Bat Banding Program data (1932-1972) and the U.S. and Canada diagnostic data for white-nose syndrome (WNS) with the USGS Bat Population Data (BPD) Project and provide the bat research community with secure, role-based access to these previously unavailable datasets. The objectives of this project were to: 1) integrate WNS diagnostic data into the BPD (http://my.usgs.gov/bpd - content no longer available); 2) incorporate the historical bat banding data produced by the Bat Banding Program into the BPD; and 3) develop the application programming interfaces (APIs) and data services required to share these datasets with DOI and USGS enterprise data resources, BISON and ScienceBase....
The California Climate Commons (CCC) and USGS Geo Data Portal (GDP) teams have collaborated to curate and host California and Great Basin Characterization Model (BCM) results. The CCC has successfully set up a web server and installed the needed software to serve these model results using data and web service standards that are compatible with the GDP. All raw monthly data have been transferred to the GDP team for processing and metadata development for hosting on the GDP. The GDP and CCC teams have made significant progress in converting raw BCM model data to archive formats and are moving forward as planned. The project experienced delays in transferring funds to the Point Blue Conservation Science team responsible...
The Fort Collins Science Center Web Applications Team, the Core Science Analytics and Synthesis unit of the Core Science Systems Mission Area, and a North Central Climate Science Center/NCAR/NOAA partnership group collaborated on a set of automated tools to allow remapping of the Federal Geographic Data Committee (FGDC) metadata standards to International Organization for Standardization (ISO) metadata standards. This project addressed the challenge of expediting the conversion of millions of metadata records in multiple USGS catalogs that run the risk of being left in a deprecated transfer format. The project set the stage for metadata conversions by: 1. Providing roadmaps and automated processes to remap FGDC...
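A common way to automate the kind of FGDC-to-ISO remapping described above is a batch XSLT transform over the XML records. The sketch below uses lxml; the crosswalk stylesheet name is a hypothetical placeholder and does not represent the project's actual roadmaps or tooling.

```python
# Minimal sketch: batch-convert FGDC metadata records to ISO via an XSLT
# crosswalk using lxml (pip install lxml). The stylesheet name is hypothetical.
from pathlib import Path
from lxml import etree

STYLESHEET = "fgdc_to_iso19115.xsl"     # hypothetical FGDC->ISO crosswalk

def convert_records(src_dir: str, out_dir: str) -> int:
    """Apply the crosswalk to every FGDC XML record in src_dir; return the count."""
    transform = etree.XSLT(etree.parse(STYLESHEET))
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    count = 0
    for record in Path(src_dir).glob("*.xml"):
        iso_doc = transform(etree.parse(str(record)))
        (out / record.name).write_bytes(
            etree.tostring(iso_doc, pretty_print=True, xml_declaration=True,
                           encoding="UTF-8"))
        count += 1
    return count

if __name__ == "__main__":
    print(convert_records("fgdc_records", "iso_records"), "records converted")
```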
Web portals are one of the principal ways geospatial information can be communicated to the public. A few prominent USGS examples are the Geo Data Portal (http://cida.usgs.gov/gdp/ [URL is accessible with Google Chrome]), EarthExplorer (http://earthexplorer.usgs.gov/), the former Derived Downscaled Climate Projection Portal, the Alaska Portal Map (http://alaska.usgs.gov/portal/), the Coastal Change Hazards Portal (http://marine.usgs.gov/coastalchangehazardsportal/), and The National Map (http://nationalmap.gov/). Currently, web portals are developed at relatively high effort and cost, with web developers working with highly skilled data specialists on custom solutions that meet user needs. To address this issue,...