
ScienceBase Catalog: Community for Data Integration (CDI) Projects, Fiscal Year 2017

11 results

Inventories of landslides and liquefaction triggered by major earthquakes are key research tools that can be used to develop and test hazard models. To eliminate redundant effort, we created a centralized and interactive repository of ground failure inventories that currently hosts 32 inventories generated by USGS and non-USGS authors and designed a pipeline for adding more as they become available. The repository consists of (1) a ScienceBase community page where the data are available for download and (2) an accompanying web application that allows users to browse and visualize the available datasets. We anticipate that easier access to these key datasets will accelerate progress in earthquake-triggered ground...
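As a sketch of how such a repository might be queried programmatically: the ScienceBase catalog serves items as JSON, so each inventory's metadata and file links can be fetched by item ID. The item ID and the handling of the "files" field below are illustrative assumptions, not the project's actual code.

```python
import json
from urllib.request import urlopen

SB_CATALOG = "https://www.sciencebase.gov/catalog/item"

def fetch_inventory_item(item_id):
    """Fetch one ScienceBase item as JSON. A real item ID would come from
    the ground-failure community page; none is hard-coded here."""
    with urlopen(f"{SB_CATALOG}/{item_id}?format=json", timeout=30) as resp:
        return json.load(resp)

def list_download_links(item_json):
    """Collect file-download URLs from an item's 'files' entries, if any."""
    return [f["url"] for f in item_json.get("files", []) if "url" in f]
```

A script built this way can walk the community page's child items and pull every inventory down for local analysis.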
ScienceCache was originally developed as a mobile device data collection application for a citizen science project. ScienceCache communicates with a centralized database, enabling near real-time use of collected data and more efficient data collection in the field. We improved ScienceCache by creating a flexible, reliable platform that reduces the effort required to set up a survey and manage incoming data. Now, ScienceCache can be easily adapted for citizen science projects or restricted to specific users for private internal research. We improved scEdit, a web application interface, to allow creation of more complex data collection forms and survey routes to support scientific studies....
Geotagged photographs have become a useful medium for recording, analyzing, and communicating Earth science phenomena. Despite their utility, many field photographs are not published or preserved in a spatial or accessible format—oftentimes because of confusion about photograph metadata, a lack of stability, or user customization in free photo sharing platforms. After receiving a request to release about 1,210 geotagged geological field photographs of the Grand Canyon region, we set out to publish and preserve the collection in the most robust (and expedient) manner possible (fig. 6). We leveraged and reworked existing metadata, JavaScript, and Python tools and developed a toolkit and proposed workflow to display...
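A recurring step in publishing geotagged photographs is converting EXIF-style GPS coordinates (degrees, minutes, seconds plus a hemisphere reference) into signed decimal degrees for mapping. A minimal sketch of that conversion (not the project's actual toolkit code):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS degrees/minutes/seconds and a hemisphere
    reference ('N', 'S', 'E', 'W') to signed decimal degrees."""
    dd = degrees + minutes / 60.0 + seconds / 3600.0
    # Southern and western hemispheres are negative in decimal degrees.
    return -dd if ref in ("S", "W") else dd
```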
Executive Summary
Traditionally in the USGS, data are processed and analyzed on local researcher computers, then moved to centralized, remote computers for preservation and publishing (ScienceBase, Pubs Warehouse). This approach requires each researcher to have the necessary hardware and software for processing and analysis, and to bring all external data required for the workflow over the internet to their local computer. To test a more efficient and effective scientific workflow, we explored an alternate model: storing scientific data remotely and performing data analysis and visualization close to the data, using only a local web browser as an interface. Although this environment was not a good fit...
USGS scientists often face computationally intensive tasks that require high-throughput computing capabilities. Several USGS facilities use HTCondor to run their computational pools but are not necessarily connected to the larger USGS pool. This project demonstrated how to connect HTCondor pools by flocking, or coordinating, within the USGS. In addition to flocking the Upper Midwest Environmental Science Center and the Wisconsin Water Science Center, we have flocked with the USGS Advanced Research Computing Yeti supercomputing cluster and other water science centers. We also developed tutorials on how to sandbox code using Docker within the USGS environment for use with high-throughput computing. A main accomplishment...
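Flocking between HTCondor pools is driven by the `FLOCK_TO` and `FLOCK_FROM` configuration settings; a minimal sketch is below. The hostnames are hypothetical placeholders, and a real deployment also needs matching security and authorization settings between the pools.

```ini
# On the submitting pool's machines: pools that jobs may flock to when
# local resources are exhausted (hostnames are hypothetical examples).
FLOCK_TO = cm.umesc.example.usgs.gov, cm.wiwsc.example.usgs.gov

# On each receiving pool's central manager: pools allowed to flock in.
FLOCK_FROM = cm.submitting-pool.example.usgs.gov
```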
In this age of rapidly developing technology, scientific information is constantly being gathered across large spatial scales. Yet, our ability to coordinate large-scale monitoring efforts depends on development of tools that leverage and integrate multiple sources of data. North American bats are experiencing unparalleled population declines. The North American Bat Monitoring Program (NABat), a multi-national, multi-agency coordinated monitoring program, was developed to better understand the status and trends of North American bats. Similar to other large-scale monitoring programs, the ultimate success of NABat relies on a unified web-based data system. Our project successfully developed a program interface...
U.S. Geological Survey (USGS) scientists are at the forefront of research that is critical for decision-making, particularly through the development of models (Bayesian networks, or BNs) that forecast coastal change. The utility of these tools outside the scientific community has been limited because they rely on expensive, technical software and a moderate understanding of statistical analyses. We proposed to convert one of our models from proprietary to freely available open-source software, resulting in a portable interactive web-interface. The resulting product will serve as a prototype to demonstrate how interdisciplinary USGS science and models can be transformed into an approachable format for decision-makers....
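At its core, a Bayesian network of this kind encodes conditional probabilities and answers queries by marginalization and Bayes' rule, which needs no proprietary software. A toy two-node network in plain Python (the variables and probabilities are illustrative placeholders, not values from the USGS coastal-change model):

```python
# Toy network: Storm -> Erosion. All numbers are made-up placeholders.
p_storm = {True: 0.2, False: 0.8}
p_erosion_given_storm = {True: {True: 0.7, False: 0.3},
                         False: {True: 0.1, False: 0.9}}

def p_erosion(erosion):
    # Marginalize over the parent: P(E) = sum_S P(E|S) * P(S).
    return sum(p_erosion_given_storm[s][erosion] * p_storm[s]
               for s in (True, False))

def p_storm_given_erosion(storm):
    # Bayes' rule: P(S | E=True) = P(E=True|S) * P(S) / P(E=True).
    return p_erosion_given_storm[storm][True] * p_storm[storm] / p_erosion(True)
```

Because the arithmetic is this simple, the model logic ports cleanly to open-source libraries and a JavaScript web interface.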
The USGS 3D Elevation Program (3DEP) is managing the acquisition of lidar data across the Nation for high-resolution mapping of the land surface, useful for multiple applications. Lidar data are initially collected as 3-dimensional “point clouds” that map the interaction of the airborne laser with earth surface features, including vegetation, buildings, and ground features. Generally, the product of interest has been high-resolution digital elevation models generated by filtering the point cloud for laser returns that come from the ground surface and removing returns from vegetation, buildings, powerlines, and other above-ground features. However, there is a wealth of information in the full point cloud on vegetation...
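The ground-filtering step described above can be sketched using the standard ASPRS LAS point classification codes (2 = ground). The points below are synthetic stand-ins for arrays that would normally be read from a LAS/LAZ file:

```python
# ASPRS LAS classification codes: 2 = ground, 5 = high vegetation, 6 = building.
GROUND = 2

def split_ground_returns(points):
    """Separate ground returns (the input to a bare-earth DEM) from
    above-ground returns. `points` is a list of (z, classification) tuples."""
    ground = [z for z, c in points if c == GROUND]
    above = [z for z, c in points if c != GROUND]
    return ground, above

# Synthetic demonstration: two ground returns, one tree, one building.
ground_z, above_z = split_ground_returns(
    [(101.2, 2), (115.8, 5), (100.9, 2), (123.4, 6)])
```

The discarded above-ground returns are exactly where the point cloud's untapped vegetation information lives.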
A BioBlitz is a field survey method for finding and documenting as many species as possible in a specific area over a short period. The National Park Service and National Geographic Society hosted the largest BioBlitz survey ever in 2016; people in more than 120 national parks used the iNaturalist app on mobile devices to document organisms they observed. Resulting records have Global Positioning System (GPS) coordinates, include biological accuracy assessments, and provide an unprecedented snapshot of biodiversity nationwide. Additional processing and analysis would make these data available to inform conservation and management decisions. This project developed a process to integrate iNaturalist data with existing...
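Integrating iNaturalist records with other datasets typically starts by flattening each observation into a tabular row. A minimal sketch (field names follow iNaturalist export conventions, but the record shown in the test is a made-up example, not project data):

```python
def flatten_observation(obs):
    """Map one iNaturalist-style observation dict to a flat row.
    'location' is assumed to be a 'lat,lng' string, as in API responses."""
    lat, lon = (obs.get("location") or ",").split(",")
    return {
        "taxon": (obs.get("taxon") or {}).get("name"),
        "observed_on": obs.get("observed_on"),
        "latitude": float(lat) if lat else None,
        "longitude": float(lon) if lon else None,
        "quality_grade": obs.get("quality_grade"),
    }
```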
As one of the largest and oldest science organizations in the world, USGS has produced more than a century of earth science data, much of which is currently unavailable to the greater scientific community due to inaccessible or obsolescent media, formats, and technology. Tapping this vast wealth of “dark data” requires (1) a complete inventory of legacy data and (2) methods and tools to effectively evaluate, prioritize, and preserve the data with the greatest potential impact to society. Recognizing these truths and the potential value of legacy data, USGS has been investigating legacy data management and preservation since 2006, including the 2016 “DaR” project, which developed legacy data inventory and evaluation...
USGS research for the Risk and Vulnerability to Natural Hazards project at the Western Geographic Science Center has produced several geospatial datasets estimating the time required to evacuate on foot from two tsunami evacuation zones (standard and extreme) at three travel speeds (impaired, slow, and fast walking speeds) for the Island of O’ahu, HI. The final dataset tabulates O’ahu resident and employee counts by region, community, and the travel speed needed to reach safety within 15 minutes. These data are useful for emergency managers and community planners to plan for tsunami evacuations, but are often difficult to serve using traditional static maps and...
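The reach-safety-within-15-minutes test reduces to distance divided by walking speed. A minimal sketch (the speeds below are illustrative placeholders, not the published impaired/slow/fast definitions):

```python
# Illustrative walking speeds in m/s; placeholders, not the study's values.
SPEEDS = {"impaired": 0.89, "slow": 1.1, "fast": 1.52}

def can_reach_safety(distance_m, speed, limit_min=15.0):
    """Can a pedestrian cover `distance_m` to the evacuation-zone boundary
    within `limit_min` minutes at the named travel speed?"""
    return distance_m / SPEEDS[speed] <= limit_min * 60.0
```

Evaluating this over a travel-time surface for every populated cell yields the kind of tabulation the datasets provide.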


[Search result map: USGS Data at Risk: Expanding Legacy Data Inventory and Preservation Strategies]