Filters: Tags: Computational Tools and Services

96 results

U.S. Geological Survey (USGS) scientists are at the forefront of research that is critical for decision-making, particularly through the development of models (Bayesian networks, or BNs) that forecast coastal change. The utility of these tools outside the scientific community has been limited because they rely on expensive, technical software and a moderate understanding of statistical analyses. We proposed to convert one of our models from proprietary to freely available open-source software, resulting in a portable, interactive web interface. The resulting product will serve as a prototype to demonstrate how interdisciplinary USGS science and models can be transformed into an approachable format for decision-makers....
Ice jams are a major hazard. The project team worked with the US Army Corps of Engineers, National Weather Service, Silver Jackets, and USGS stakeholders to develop a mobile-friendly prototype of an Ice Jam Hazard website and reporting system. The prototype shows how ice jam conditions can be recorded nationwide. The public can view and download ice jam information. Historic ice jam locations and frequencies, as well as potentially hazardous developing ice jams, are valuable data. Given the science, modeling, and hazard warning potential provided by this data, continued development of this system is widely supported. The prototype system consists of a JavaScript client built on the Angular Material framework, hosted on Amazon...
The goal of this project was to develop a novel methodology to combine the USGS Gap Analysis Program (GAP) national land cover and species distribution data with disturbance data to describe and predict how disturbance affects biodiversity. Specifically, the project team presented a case study examining how energy development in the Williston Basin can affect grassland birds; however, the methods developed are scalable and transferable to other types of habitat conversion (anthropogenic or natural), regions, and taxa. This project had six key components: Develop a dataset delineating all oil well pads in the Williston Basin. Develop a habitat conversion tool to determine the amount and previous land cover from...
Wildfires are increasing across the western U.S., causing damage to ecosystems and communities. Addressing the fire problem requires understanding the trends and drivers of fire, yet most fire data are limited to recent decades. Tree-ring fire scars provide fire records spanning 300-500 years, yet these data are largely inaccessible to potential users. Our project will deliver the newly compiled North American Fire Scar Network — 2,592 sites, 35,602 trees, and > 300,000 fire records — to fire scientists, managers, and the public through an online application that will provide tools to explore, visualize, and analyze fire history data. The app will provide raw and derived data products, graphics, statistical summaries,...
The goal of this project was to maximize the value of expensive animal tagging data. We developed an interactive web application to help scientists understand patterns in their own tagging datasets and to help scientists, funders and agencies communicate tagging data to decision-makers and to the general public. Interactive visualizations have emerged recently as a valuable tool for identifying patterns in complex datasets that are typical of ecological tagging studies. To make it easier and faster for users to gain access to interactive movement visualizations, we developed the algorithms and web-based software platform to allow users to upload their own data into a data visualization showing dynamic movement of...
Computational models are important tools that aid process understanding, hypothesis testing, and data interpretation. The ability to easily couple models from various domains, such as surface water and groundwater, to form integrated models will aid studies in water resources. This project investigates the use of the Community Surface Dynamics Modeling System (CSDMS) Modeling Framework (CMF) to couple existing USGS hydrologic models into integrated models. The CMF provides a Basic Model Interface (BMI), in a range of common computer languages, that enables model coupling. In addition, the CMF also provides a Python wrapper for any model that adopts the BMI. In this project the Precipitation-Runoff Modeling...
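The coupling idea described above can be sketched in a few lines of Python. This is a simplified illustration of the BMI pattern, not the actual CSDMS specification: the method names (initialize, update, get_value, finalize) echo real BMI methods, but the signatures here are reduced (for example, the real BMI update() takes no arguments), and both toy models are invented for the sketch.

```python
# Minimal sketch of the Basic Model Interface (BMI) idea: every model
# exposes the same small set of methods, so a framework can couple
# models without knowing their internals. Simplified for illustration;
# not the full CSDMS BMI specification.

class RunoffModel:
    """Toy surface-water model: converts precipitation to runoff."""
    def initialize(self, config):
        self.runoff = 0.0
        self.coeff = config.get("runoff_coeff", 0.4)
    def update(self, precip):
        self.runoff = self.coeff * precip
    def get_value(self, name):
        return {"runoff": self.runoff}[name]
    def finalize(self):
        pass

class GroundwaterModel:
    """Toy groundwater model: recharge raises storage, which drains slowly."""
    def initialize(self, config):
        self.storage = config.get("initial_storage", 10.0)
    def update(self, recharge):
        self.storage += recharge - 0.05 * self.storage
    def get_value(self, name):
        return {"storage": self.storage}[name]
    def finalize(self):
        pass

def couple(precip_series):
    """Drive both models in lockstep; runoff from the surface-water
    model becomes recharge for the groundwater model each step."""
    sw, gw = RunoffModel(), GroundwaterModel()
    sw.initialize({"runoff_coeff": 0.4})
    gw.initialize({"initial_storage": 10.0})
    for p in precip_series:
        sw.update(p)
        gw.update(recharge=sw.get_value("runoff"))
    sw.finalize()
    gw.finalize()
    return gw.get_value("storage")

print(round(couple([2.0, 0.0, 5.0]), 3))  # → 11.296
```

Because the framework only ever calls the shared interface, either toy model could be swapped for a real one (or one written in another language behind the CMF's wrapper) without changing the coupling loop.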
This project team developed a Web-hosted application (that can also be used on mobile platforms) for automatic analysis of images of sediment for grain-size distribution, using the “Digital Grain Size” (DGS) algorithm of Buscombe (2013) (“DGS-Online,” 2015). This is a free, browser-based application for accurately estimating the grain-size distribution of sediment in digital images without any manual intervention or even calibration. It uses the statistical algorithm of Buscombe (2013) that estimates particle size directly from the spatial distribution of light intensity within the image. The application is designed to batch-process tens to thousands of images, utilizing cloud computing storage and processing technologies....
The cloud offers new and exciting opportunities for USGS employees to leverage computing resources and services that can quickly improve their workflows and reduce expenditures typically associated with establishing a comparable environment with physical infrastructure. However, due to the novelty of access to and use of the cloud environment, there is limited documentation and shared examples detailing how those resources and services are being used across the USGS. Developing a platform that allows cloud users to contribute to the available documentation and provides a location to consolidate information relevant to operating in the USGS cloud will help to decrease duplication of efforts across projects that share...
Rangeland systems are some of our nation’s largest providers of agro-ecological services, sustaining plant productivity that is highly variable across seasons and years. Although the ability to predict the upcoming growing season’s rangeland productivity would have enormous economic and management value – such as for making decisions about cattle stocking rates, fire, restoration, and wildlife – the ability to provide these forecasts has remained poor. New remote sensing and modeling technologies allow for dramatic improvements to near-term forecasts of rangeland productivity. With this project, our multi-disciplinary team has shown that, compared with traditional remote sensing greenness indices, NIRv-based (NIR...
The purpose of this project was to document processes for USGS scientists to organize and share data using ScienceBase, and to provide an example interactive mapping application to display those data. Data and maps from Chase and others (2016a, b) were used for the example interactive maps. Principal Investigators: Katherine J Chase, Andy Bock, Thomas R Sando. Accomplishments: The accomplishments for this project are described below. The project team developed an interactive mapping application in R that connects to data on ScienceBase, using Shiny, Leaflet (Cheng and Xie, 2016), and sbtools (Winslow and others, 2016) (fig. 10). USGS scientists can refer to the R code in the mapping application to build their...
The USGS National Land Cover Trends Project has the largest repository of field photos at the USGS (over 33,000 photos). Prior to CDI funding, Land Cover Trends had limited funding to make the national collection of photos available online for researchers, land managers, and citizens. The goal of this CDI project was to add geotags and keywords to the digital copies of each field photo and make the collection searchable and downloadable via the Internet. By funding the effort to integrate Land Cover Trends field photography and online mapping technology, CDI has helped provide access to geographic data needed to conduct science and support policy decisions. Sharing georeferenced photography distributed across the...
Large online data catalogs use controlled vocabularies to categorize datasets in ways that allow end users to sort and select data matching their needs. The eventual goal of this project is to build functional services so that the USGS Thesaurus and other USGS-controlled vocabularies will be available to the English-speaking scientific community, especially within the USGS where they can be used to improve metadata quality and data discovery. The project team used the Tetherless World Constellation (TWC) Semantic Web Methodology, which is designed to examine use cases and determine both functional and nonfunctional system requirements without prejudicial commitments to meeting those requirements by utilizing particular...
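One way a controlled-vocabulary service improves data discovery is query expansion: a search for a broad term also matches datasets tagged with its narrower terms. The sketch below is a hypothetical miniature of that pattern; the terms and the dictionary structure are invented for illustration and are not the actual USGS Thesaurus content or service API.

```python
# Hypothetical miniature of a controlled vocabulary with narrower-term
# relations, showing how a thesaurus service can expand a search query.
# Terms are invented examples, not actual USGS Thesaurus entries.
NARROWER = {
    "hydrology": ["surface water", "groundwater"],
    "surface water": ["streamflow"],
    "groundwater": [],
    "streamflow": [],
}

def expand(term):
    """Return the term plus all transitively narrower terms, so a search
    for a broad concept also retrieves more specifically tagged data."""
    found, stack = [], [term]
    while stack:
        t = stack.pop()
        if t not in found:
            found.append(t)
            stack.extend(NARROWER.get(t, []))
    return sorted(found)

print(expand("hydrology"))
# → ['groundwater', 'hydrology', 'streamflow', 'surface water']
```

The same broader/narrower relations also support metadata quality checks, for example flagging keywords that do not appear anywhere in the vocabulary.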
Legacy data (n) - Information stored in an old or obsolete format or computer system that is, therefore, difficult to access or process. (Business Dictionary, 2016) For over 135 years, the U.S. Geological Survey has collected diverse information about the natural world and how it interacts with society. Much of this legacy information is one-of-a-kind and in danger of being lost forever through decay of materials, obsolete technology, or staff changes. Several laws and orders require federal agencies to preserve and provide the public access to federally collected scientific information. The information is to be archived in a manner that allows others to examine the materials for new information or interpretations....
The goal of this project is to improve the USGS National Earthquake Information Center’s (NEIC) earthquake detection capabilities through direct integration of crowd-sourced earthquake detections with traditional, instrument-based seismic processing. During the past 6 years, the NEIC has run a crowd-sourced system, called Tweet Earthquake Dispatch (TED), which rapidly detects earthquakes worldwide using data solely mined from Twitter messages, known as “tweets.” The extensive spatial coverage and near instantaneous distribution of the tweets enable rapid detection of earthquakes often before seismic data are available in sparsely instrumented areas around the world. Although impressive for its speed, the tweet-based...
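The core of crowd-sourced detection like this is a rate-spike test: an earthquake produces a sudden burst of keyword-matching messages well above the background chatter. The sketch below is a minimal, hypothetical version of that idea; the real TED system uses more signals than message rate alone, and the window sizes and thresholds here are invented for illustration.

```python
# Minimal sketch of a rate-spike detector for keyword-matching messages.
# Invented parameters; illustrative only, not the TED system's logic.
from collections import deque

def make_detector(short_window=60.0, long_window=3600.0, ratio=5.0):
    """Return an ingest(t) function taking message timestamps (seconds).
    It reports True when the short-term message rate exceeds `ratio`
    times the long-term baseline rate and at least 10 recent messages
    have arrived (to suppress spikes in near-silent streams)."""
    times = deque()
    def ingest(t):
        times.append(t)
        while times and times[0] < t - long_window:
            times.popleft()
        recent = sum(1 for x in times if x >= t - short_window)
        short_rate = recent / short_window
        long_rate = len(times) / long_window
        return short_rate > ratio * long_rate and recent >= 10
    return ingest

detect = make_detector()
# Background chatter: one message per minute for an hour -> no alert.
alerts = [detect(60.0 * i) for i in range(60)]
# Burst: 20 messages within two seconds right after -> alert fires.
burst = [detect(3600.0 + 0.1 * i) for i in range(20)]
print(any(alerts), any(burst))  # → False True
```

Because the test compares the stream against its own baseline, the same detector adapts to regions with very different background message volumes.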
The purpose of the Data Management Training (DMT) Clearinghouse project was twofold. First, the project aimed to increase discoverability and accessibility of the wealth of learning resources that have been developed to inform and train scientists about data management in the Earth sciences. Second, the project team wanted to facilitate the use of these learning resources by providing descriptive information (metadata) that can help research scientists, students, or teachers assess whether a resource would be appropriate and useful for their needs. The project team established the following objectives for the project: Create an online, searchable, and browsable clearinghouse of learning resources on data...
The Nonindigenous Aquatic Species (NAS) Database and Alert System (https://nas.er.usgs.gov/default.aspx) provides a framework for the rapid dissemination of new invasions as they are incorporated into the NAS Database. The system notifies registered users of new sightings of >1,330 non-native aquatic species as part of national-scale early detection and rapid response systems (EDRR), and in support of several federal programs: National Invasive Species Council, Aquatic Nuisance Species Task Force, and other Department of the Interior agencies. The NAS group has developed a new tool, the Alert Risk Mapper (ARM; https://nas.er.usgs.gov/AlertSystem/default.aspx), to characterize river reaches, lakes, and other waterbodies...
The purpose of this project was to test and develop first-generation biological data integration and retrieval capabilities for the Water Quality Portal (National Water Quality Monitoring Council, [n.d.]) using the Water Quality Exchange (WQX) data exchange standard (Environmental Information eXchange Network, 2016). The Water Quality Portal (Portal) is a significant national water data distribution node that is aligned with the vision of the Open Water Data Initiative (Advisory Committee on Water Information, [n.d.]). The Portal is sponsored by the USGS, the EPA, and the National Water Quality Monitoring Council. The WQX data exchange standard is a mature standard widely adopted within the water quality monitoring...
The Center for Biological Informatics (CBI, now reorganized into the Core Science Analytics and Synthesis unit of the Core Science Systems Mission Area) and the Fort Collins Science Center Web Applications Team joined resources to integrate their respective Metadata Tool and ScienceBase system: Incorporated the CBI-developed Metadata Tool into the ScienceBase metadata workflow. This work involved diagramming how several independently developed systems can work together to manage data in the USGS and make it available to partners and the public. See the illustration of the USGS data management workflow from the 2011 CDI Annual Report (link in Resource section below). Designed and built a rich user interface client that...
As one of the cornerstones of the U.S. Geological Survey's (USGS) National Geospatial Program, The National Map is a collaborative effort among the USGS and other Federal, State, and local partners to improve and deliver topographic information for the Nation. It has many uses ranging from recreation to scientific analysis to emergency response. The National Map is easily accessible for display on the Web, as products and services, and as downloadable data. (Description from The National Map website, http://nationalmap.gov/about.html) In fiscal year 2010, the Community for Data Integration (CDI) funded the development of web services for the National Hydrography Dataset (NHD), the National Elevation Dataset (NED)...
The sustainability of coastal water resources is being affected by climate change, sea level rise, and modifications to land use and hydrologic systems. To prepare for and respond to these drivers of hydrologic change, coastal water managers need real-time data, an understanding of temporal trends, and information about how current and historical data compare. Coastal water managers often must make decisions based on information pieced together from multiple sources because the available data and tools are scattered across various databases and websites; to aid coastal water managers, a website that consolidates data from multiple organizations and provides statistical analysis of hydrologic and water quality data...