This project team developed a web-hosted application (also usable on mobile platforms) for automatic analysis of sediment images for grain-size distribution, using the “Digital Grain Size” (DGS) algorithm of Buscombe (2013) (“DGS-Online,” 2015). The free, browser-based application accurately estimates the grain-size distribution of sediment in digital images without manual intervention or calibration, using a statistical algorithm that estimates particle size directly from the spatial distribution of light intensity within the image. The application is designed to batch-process tens to thousands of images, utilizing cloud computing storage and processing technologies....
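The core intuition is that coarser grains produce intensity fluctuations with a longer spatial scale. The toy sketch below (plain NumPy, not the published DGS algorithm) estimates a characteristic correlation length from a 1-D intensity profile, purely to illustrate the intensity-autocorrelation idea:

```python
import numpy as np

def correlation_length(intensity, threshold=0.5):
    """Toy length-scale estimate: the first lag at which the normalized
    autocorrelation of a 1-D intensity profile falls below `threshold`.
    Illustrative of the DGS idea only, not Buscombe's (2013) method."""
    x = intensity - intensity.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    acf = acf / acf[0]                                   # normalize to 1 at lag 0
    below = np.where(acf < threshold)[0]
    return int(below[0]) if below.size else len(x)

# Synthetic profile: white noise smoothed by a 10-pixel box, so intensity
# features have a ~10-pixel scale (standing in for "grains").
rng = np.random.default_rng(0)
profile = np.convolve(rng.normal(size=2000), np.ones(10) / 10, mode="same")
scale = correlation_length(profile)
```

With the 10-pixel smoothing, the estimated scale comes out near half the box width, reflecting the feature size built into the synthetic profile.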
Legacy data (n) - Information stored in an old or obsolete format or computer system that is, therefore, difficult to access or process. (Business Dictionary, 2016) For over 135 years, the U.S. Geological Survey has collected diverse information about the natural world and how it interacts with society. Much of this legacy information is one-of-a-kind and in danger of being lost forever through decay of materials, obsolete technology, or staff changes. Several laws and orders require federal agencies to preserve and provide the public access to federally collected scientific information. The information is to be archived in a manner that allows others to examine the materials for new information or interpretations....
The goal of this project is to improve the USGS National Earthquake Information Center’s (NEIC) earthquake detection capabilities through direct integration of crowd-sourced earthquake detections with traditional, instrument-based seismic processing. During the past 6 years, the NEIC has run a crowd-sourced system, called Tweet Earthquake Dispatch (TED), which rapidly detects earthquakes worldwide using data solely mined from Twitter messages, known as “tweets.” The extensive spatial coverage and near instantaneous distribution of the tweets enable rapid detection of earthquakes often before seismic data are available in sparsely instrumented areas around the world. Although impressive for its speed, the tweet-based...
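A tweet-based detector of this kind boils down to flagging a sudden jump in message rate above the background chatter. The sketch below uses a short-term/long-term average ratio on per-minute counts; it illustrates the general idea, not TED's actual algorithm or thresholds:

```python
from collections import deque

def spike_detector(counts, short=3, long=12, ratio=3.0):
    """Toy STA/LTA-style detector on per-minute message counts: flag
    minutes where the mean over the last `short` minutes exceeds `ratio`
    times the mean over the last `long` minutes. Illustrative only."""
    window = deque(maxlen=long)
    alerts = []
    for i, c in enumerate(counts):
        window.append(c)
        if len(window) == long:
            lta = sum(window) / long
            sta = sum(list(window)[-short:]) / short
            if lta > 0 and sta / lta >= ratio:
                alerts.append(i)
    return alerts

# Quiet background of a few tweets per minute, then a sudden burst.
baseline = [2, 3, 2, 3, 2, 3, 2, 3, 2, 3, 2, 3]
alerts = spike_detector(baseline + [40, 55, 60])
```

On this series the detector fires on the second and third minutes of the burst (indices 13 and 14), once the short-window mean clearly exceeds the slowly adapting background.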
The sustainability of coastal water resources is being affected by climate change, sea level rise, and modifications to land use and hydrologic systems. To prepare for and respond to these drivers of hydrologic change, coastal water managers need real-time data, an understanding of temporal trends, and information about how current and historical data compare. Coastal water managers often must make decisions based on information pieced together from multiple sources because the available data and tools are scattered across various databases and websites; to aid coastal water managers, a website that consolidates data from multiple organizations and provides statistical analysis of hydrologic and water quality data...
USGS research in the Western Geographic Science Center has produced several geospatial datasets estimating the time required to evacuate on foot from a Cascadia subduction zone earthquake-generated tsunami in the U.S. Pacific Northwest. These data, created as a result of research performed under the Risk and Vulnerability to Natural Hazards project, are useful for emergency managers and community planners but are not in the best format to serve their needs. This project explored options for formatting and publishing the data for consumption by external partner agencies and the general public. The project team chose ScienceBase as the publishing platform, both for its ability to convert spatial data into web services...
People in the locality of an earthquake publish anecdotal information about the shaking within seconds of its occurrence via social network technologies such as Twitter. In contrast, depending on the size and location of the earthquake, scientific alerts can take between two and twenty minutes to publish. The goal of this project is to assess earthquake damage and effects, as impacts unfold, by leveraging expeditious, free, and ubiquitous social-media data to enhance our response to earthquake damage and effects. Principal Investigators: Michelle Guy, Paul S. Earle. Cooperators/Partners: Scott R. Horvath, Douglas Bausch, Gregory M. Smoczyk. The project leverages an existing system that performs...
Increasing attention is being paid to the importance of proper scientific data management and implementing processes that ensure that products being released are properly documented. USGS policies have been established to properly document not only publications, but also the related data and software. This relatively recent expansion of documentation requirements for data and software may present a daunting challenge for many USGS scientists whose major focus is their physical science and who have less expertise in information science. As a proof of concept, this project has created a software solution that facilitates this process through a user-friendly, but comprehensive, interface embedded in an existing...
We aim to migrate our research workflow from a closed system to an open framework, increasing flexibility and transparency in our science and accessibility of our data. Our hyperspectral data of agricultural crops are crucial for training/validating machine learning algorithms to study food security, land use, etc. Generating such data is resource-intensive and requires expertise, proprietary software, and specific hardware. We will use CHS resources on the Pangeo JupyterHub to recast our data and workflows into a cloud-agnostic, open-source framework. Lessons learned will be shared at workshops, in reports, and on our website so others can increase the openness and accessibility of their data and workflows....
Digital Elevation Models (DEM) provide details of the earth’s surface and are used for visualization, physical modeling, and elevation change analysis. Creating DEMs in coastal environments is complicated by the highly ephemeral nature of the coast and the need to span the land-water interface. This requires merging multiple bathymetric and topographic datasets that have been collected at different times, using different instrument platforms with varying levels of accuracy, and with variable spatial resolution and coverage. Because coastal change can occur over relatively short time scales (days to weeks in the case of storms), rapid updates to coastal DEMs are also needed. These challenges and the lack of available...
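The merging problem described here can be sketched as combining overlapping grids of unequal accuracy. The toy function below (an inverse-variance weighted merge on 1-D profiles, not the project's actual method) gives the lower-uncertainty source more weight where sources overlap and falls back to whichever source has data elsewhere:

```python
import numpy as np

def merge_dems(z1, s1, z2, s2):
    """Toy merge of two overlapping elevation grids (NaN = no data).
    Where both have data, take the inverse-variance weighted average of
    elevations z1, z2 given per-cell uncertainties s1, s2; elsewhere use
    whichever source exists. A sketch of the problem, not the project's
    published merging method."""
    w1 = 1.0 / np.square(s1)
    w2 = 1.0 / np.square(s2)
    both = ~np.isnan(z1) & ~np.isnan(z2)
    out = np.where(np.isnan(z1), z2, z1)  # fill gaps from the other source
    out[both] = (z1[both] * w1[both] + z2[both] * w2[both]) / (w1[both] + w2[both])
    return out

# Topographic profile (accurate, land only) and bathymetric profile
# (noisier, water only) overlapping at the shoreline cell.
topo = np.array([5.0, 3.0, 1.0, np.nan, np.nan])
bathy = np.array([np.nan, np.nan, 0.0, -2.0, -4.0])
dem = merge_dems(topo, np.full(5, 0.1), bathy, np.full(5, 0.3))
```

At the overlapping shoreline cell the merged elevation (0.9 m) sits close to the accurate topographic value (1.0 m) rather than splitting the difference, because the weights scale with 1/uncertainty².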
Increasingly, USGS scientists seek to share and collaborate while working on data and code, and they often require advanced computing resources. Jupyter Notebooks are one tool for creating these workflows: interactive code “notebooks” that combine code and text in one document, enabling scientists to share the stories held within their data. Recently, USGS launched an instance of Pangeo—a community platform for Big Data geoscience—as a tool for internally hosting and executing these notebooks. Few examples and no formal documentation exist to help USGS scientists use Pangeo. We will create and curate examples of using Jupyter Notebooks...
Metadata Wizard
THIS VERSION OF THE TOOL HAS BEEN REPLACED BY AN UPDATED VERSION. Users should obtain the new version of the Metadata Wizard at the links below:
User manual: https://usgs.github.io/fort-pymdwizard/index.html
Software installer: https://github.com/usgs/fort-pymdwizard/releases
The new tool will eventually replace the Metadata Wizard hosted from this page, to eliminate dependencies on ESRI ArcDesktop and to enable Mac users to utilize the Metadata Wizard. Documentation and previous release notes for the legacy publication and product are below.
------------------------------
Metadata Wizard version: 1.8.5 (Last updated: 1/21/20) To download this toolbox...
In the mid-1800s, tile drains were installed in poorly drained soils of topographic lows as a water-management practice to protect cropland during wet conditions; consequently, estimations of tile-drain location have been based on soil series. Most tile drains are in the Midwest; however, each state has farms with tile, and tile-drain density has increased in the last decade. Where tile drains quickly remove water from fields, groundwater and stream-water interaction can change, affecting water availability and flooding. Nutrients and sediment can quickly travel to streams through tile, contributing to harmful algal blooms and hypoxia in large water bodies. Tile drains are below the soil surface, about 1 m deep, but their location...
Global climate models are a key source of climate information and produce large amounts of spatially explicit data for various physical parameters. However, these projections have substantial uncertainties associated with them, and the datasets themselves can be difficult to work with. The project team created the first version (cst 0.1.0) of the Climate Futures Toolbox, an open source workflow in R that allows users to access downscaled climate projections data, clip data by spatial boundaries (shapefile), save the output, and generate summary tables and plots. A detailed R vignette guides users to easily generate derived variables in order to answer specific questions about their region of interest (e.g. how will...
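The Climate Futures Toolbox itself is an R package (cst), but the kind of derived summary it produces can be illustrated in a few lines. The sketch below computes a hypothetical mid-century temperature change relative to a historical baseline from a synthetic projection series; all values and periods here are made up for illustration:

```python
import numpy as np

# Synthetic annual maximum-temperature projection with a warming trend
# (deg C). Purely illustrative data, not output of the cst package.
years = np.arange(1990, 2060)
tmax = 20.0 + 0.03 * (years - 1990)

# Derived variable: mean change between a mid-century period and a
# historical baseline, the kind of question a cst vignette walks through.
baseline = tmax[(years >= 1990) & (years <= 2019)].mean()
mid_century = tmax[(years >= 2040) & (years <= 2059)].mean()
change = mid_century - baseline
```

For this synthetic 0.03 °C/yr trend, the 2040–2059 mean exceeds the 1990–2019 baseline by 1.35 °C; a real analysis would draw the series from downscaled projections clipped to a region of interest.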
Geographic Information System (GIS) analyses are an essential part of natural resource management and research. Calculating and summarizing data within intersecting GIS layers is common practice for analysts and researchers. However, the various tools and steps required to complete this process are slow and tedious, requiring many tools iterating over hundreds, or even thousands of datasets. We propose to combine a series of ArcGIS geoprocessing capabilities with custom scripts to create tools that will calculate, summarize, and organize large amounts of data that can span many temporal and spatial scales with minimal user input. The tools work with polygons, lines, points, and rasters to calculate relevant summary...
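The intersect-and-summarize step at the heart of such tools can be reduced to a toy example: assign features to zones, then compute per-zone statistics. The sketch below does this for points in axis-aligned rectangular zones using plain Python; real workflows would use ArcGIS geoprocessing tools or a spatial library, and the function and zone names here are hypothetical:

```python
def summarize_points_by_zone(points, zones):
    """Toy zonal summary: count points and average their attribute value
    within rectangular zones given as (xmin, ymin, xmax, ymax). A minimal
    stand-in for an intersect-and-summarize geoprocessing step."""
    stats = {}
    for name, (xmin, ymin, xmax, ymax) in zones.items():
        vals = [v for (x, y, v) in points
                if xmin <= x <= xmax and ymin <= y <= ymax]
        stats[name] = {"count": len(vals),
                       "mean": sum(vals) / len(vals) if vals else None}
    return stats

# Points carry (x, y, attribute value); zones partition the study area.
points = [(1, 1, 10.0), (2, 2, 20.0), (8, 8, 30.0)]
zones = {"west": (0, 0, 5, 5), "east": (5, 5, 10, 10)}
stats = summarize_points_by_zone(points, zones)
```

The same count/mean pattern extends to lines, polygons, and raster cells once the containment test is replaced by a proper geometric intersection.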
USGS research for the Risk and Vulnerability to Natural Hazards project at the Western Geographic Science Center has produced several geospatial datasets estimating the time required to evacuate on foot from two tsunami evacuation zones (standard and extreme) traveling at three travel speeds (impaired, slow, and fast walking speeds) for the Island of O’ahu, HI. Tabulation of O’ahu resident and employee counts by region, community, and the estimated travel speed necessary to reach safety within 15 minutes serves as the final dataset for conclusions. These data are useful for emergency managers and community planners to plan for tsunami evacuations, but are often difficult to serve using traditional static maps and...
Fire has increased dramatically across the western U.S. and these increases are expected to continue. With this reality, it is critical that we improve our ability to forecast the timing, extent, and intensity of fire to provide resource managers and policy makers the information needed for effective decisions. For example, an advanced, spatially-explicit prediction of the upcoming fire season would support the planning and prioritization of fire-fighting crews, the placement and abundance of fire breaks, and the amount and type of seed needed for post-fire restoration. While the Southwest has seen exceptional increases in fire, these drier ecosystems are also notably difficult for fire predictions because of unique...

