Sponsor: National Science Foundation
Description: The objective of this project is to construct a service that allows past and present un-curated data to be utilized by science, while simultaneously demonstrating the novel science that can be conducted with such data. The effort focuses on large, distributed, and heterogeneous bodies of past and present un-curated data, often referred to in the scientific community as long-tail data: data that would have great value to science if its contents were readily accessible. The proposed framework is made up of two re-purposable cyberinfrastructure building blocks, referred to as a Data Access Proxy (DAP) and a Data Tilling Service (DTS). Brown Dog supports three scientific use cases, of which ours is one. Our work involves developing novel green infrastructure design criteria and models that integrate requirements for storm water management with ecosystem and human health and wellbeing. The objective of this use case is to leverage the improved access to data from Brown Dog to develop and integrate models that enable the development of such design criteria. Data will be generated on citizen reactions to green infrastructure design across a range of socio-economic and hydroclimate settings to calibrate and validate model predictions. Myriad data from Federal, state, county, local, and non-governmental organizations, including soil types, land use, land cover, sewer infrastructure, groundwater depth, rainfall, and evapotranspiration rates, will be combined using novel integration approaches at both local and watershed spatial scales. The data will be assimilated into hydrologic models to improve forecasting of hydrologic and ecologic impacts and enable integrated design for multiple criteria.
Sponsor: National Science Foundation
Description: This project develops a novel computational Green Infrastructure (GI) design framework that integrates interactive, neighborhood-scale, collaborative design by multiple stakeholders ("crowd-sourced" design) with multi-scale models of ecosystem and human impacts. The following research questions are being addressed: (1) How much does coupling site-scale ecohydrology with catchment-scale hydraulic routing improve predictions? (2) How well can stakeholder preferences be predicted using design image feature extraction and machine learning? (3) What interactive optimization and visualization techniques lead to the most rapid and complete consensus among diverse stakeholders? (4) Do stakeholders using interactive cyberinfrastructure tools consider more options and explore more of the GI design space? A "crowd-sourced" design framework is developed to enable stakeholders to interactively create and evaluate potential GI designs that reflect consideration of the full breadth of social, economic, and environmental criteria. The research questions are evaluated in diverse neighborhoods within three urban catchments in the Baltimore Ecosystem Study, working closely with environmental non-governmental organizations to ensure that the results provide significant benefits to community stakeholders. The models developed in this project are the first to integrate criteria for human and ecosystem wellbeing with site- and watershed-scale hydrologic processes, a key advance for improving understanding and implementation of GI design. Research on map and image visualization identifies which visualization approaches best support stakeholder engagement and consensus-building in interactive collaborative design. The project makes technical advances in interactive optimization and model parameterization accessible to a broad range of stakeholders, from regulators and planners to contractors and homeowners.
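Research question (2) concerns predicting stakeholder preferences from design image features. As a purely illustrative sketch (the feature definitions, model choice, and training scheme here are hypothetical assumptions, not the project's actual method), a minimal pipeline might extract simple color-composition features from a design image and fit a logistic preference model:

```python
import math

def extract_features(pixels):
    """Toy image features: fraction of green-dominant pixels (vegetation-like)
    and gray pixels (paved-looking). Hypothetical stand-ins for real
    design-image feature extraction."""
    n = len(pixels)
    green = sum(1 for r, g, b in pixels if g > r and g > b) / n
    gray = sum(1 for r, g, b in pixels
               if abs(r - g) < 20 and abs(g - b) < 20) / n
    return [green, gray]

def train_logistic(X, y, lr=0.5, epochs=200):
    """Fit a logistic-regression preference model with plain SGD.
    X: lists of feature values; y: 1 = preferred, 0 = not preferred."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            err = p - yi  # gradient of the log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Predicted probability that a stakeholder prefers a design with features x."""
    return 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, x)) + b)))
```

In practice the project's models would use far richer image features and stakeholder ratings collected through the crowd-sourced design tool; the sketch only shows the shape of the learning problem.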
Sponsor: Microsoft Research, Inc
Description: Floods and droughts cause approximately $12 billion in damages annually in the U.S. alone, the highest among all disasters. While the National Oceanic and Atmospheric Administration (NOAA) National Weather Service provides flood and drought warnings across the nation, such operational systems are currently not capable of forecasting with the accuracy and real-time awareness needed to support effective disaster planning and response. With climate change causing more frequent extreme weather events, the need for improved real-time forecasting and decision support is increasingly pressing. This project focuses on real-time streamflow forecasting and decision support, key capabilities for predicting and managing floods and droughts in the 3.5 million miles of rivers in the U.S., as well as rivers across the globe. Using a novel multi-scale approach, we are demonstrating the value of nationwide real-time river modeling services that monitor conditions in the nation's rivers and spawn finer-scale modeling and decision support in drought- or flood-stricken areas. The project will leverage the rapidly expanding real-time national data and model services provided by NOAA and the United States Geological Survey (USGS), while laying the foundation for providing global information using global water data services that are currently under development at ESRI (www.esri.com) and Kisters (www.kisters.net).
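The multi-scale idea described above, nationwide monitoring that spawns finer-scale modeling where conditions warrant, can be sketched as a simple triggering rule. The percentile thresholds and function names below are illustrative assumptions, not the project's actual service design:

```python
# Hypothetical sketch: a coarse national monitor flags river reaches whose
# real-time flow crosses flood/drought percentile thresholds relative to that
# reach's historical flows, and "spawns" finer-scale modeling for just those.
FLOOD_PCTL, DROUGHT_PCTL = 0.95, 0.05  # assumed trigger thresholds

def classify(flow, climatology):
    """Rank a real-time flow value against the reach's historical flows."""
    hist = sorted(climatology)
    rank = sum(1 for q in hist if q <= flow) / len(hist)
    if rank >= FLOOD_PCTL:
        return "flood-watch"
    if rank <= DROUGHT_PCTL:
        return "drought-watch"
    return "normal"

def spawn_fine_scale(reaches, realtime, climatologies):
    """Return only the reaches that warrant a finer-scale model run,
    tagged with the triggering condition."""
    flagged = {}
    for r in reaches:
        status = classify(realtime[r], climatologies[r])
        if status != "normal":
            flagged[r] = status
    return flagged
```

A real service would pull the `realtime` values from NOAA/USGS data feeds and launch actual hydraulic model runs for the flagged reaches; the sketch captures only the coarse-to-fine triggering logic.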
Sponsor: Metropolitan Water Reclamation District of Greater Chicago
Description: Reductions in combined sewer overflow (CSO) occurrences and volumes are a priority for managers of urban combined sewers, given the resultant reductions in flooding, water quality problems, and financial costs. To reduce CSO volumes and occurrences, decision support tools must be developed that allow managers and operators to maximize the efficiency of the system in real time. This project integrates and investigates data from the combined sewer system of Chicago, Illinois, as a case study for improving understanding of the causes of overflow events. The main tasks associated with this project were:
(1) The development and delivery of a Web-based tool that enables automated creation of storm-scale animations of CSO occurrences. This tool will be built on technologies from current cyberinfrastructure research and development work at NCSA, especially a workflow system called DataWolf with Publishable Active Workflows (PAW). DataWolf and PAW enable analysts to easily create and publish (as Web services) multiple steps to access, analyze, model, and visualize data. The tool will allow for the creation of animations that include sluice gate positions on the TARP dropshafts, WWRP inflow, TARP fill level, waterways dissolved oxygen (DO), and streamflow data (as available), in addition to the weather radar and CSO event data. The storm to be animated can be selected from a drop-down list of storm events, initially from 2005-10, although additional years can easily be added later.
(2) A statistical spatio-temporal analysis of causal factors of CSOs. The combined sewer system is observed at the station and system levels, and at individual-storm and full-observation-period scales, to find the factors both correlated with one another and associated with the managers' lowest-desirability outcomes. Factors analyzed include a range of storm intensities, location in the combined sewer system, long-duration CSO events, and CSO events affiliated with open sluice gates. Patterns identified from this analysis will then be used to develop a spatiotemporal probabilistic model that can forecast the occurrence of CSOs at any station in the system in real time. The model will be integrated into a web-based tool (web dashboard) using the National Center for Supercomputing Applications (NCSA)'s DataWolf (a tool for publishing active workflows), and is intended to provide real-time information to the operators and engineers of the sewer system. The successful implementation of this model would support the real-time operations of the sewer system and could translate into better planning and management decisions for the system by enabling the identification of system components that could be changed to reduce CSO costs, occurrences, and severity.
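To illustrate the flavor of a probabilistic real-time CSO forecast (heavily simplified; the rainfall binning, choice of predictors, and smoothing below are hypothetical, not the project's actual model), one could estimate conditional overflow probabilities at a station from historical records of rainfall intensity and sluice gate state:

```python
from collections import defaultdict

def fit_cso_model(records):
    """Estimate P(CSO | rain bin, gate state) from historical station records.
    records: iterable of (rain_mm_per_hr, gate_open: bool, cso_occurred: bool).
    A hypothetical, heavily simplified stand-in for a spatiotemporal
    probabilistic CSO model."""
    counts = defaultdict(lambda: [0, 0])  # key -> [CSO events, observations]
    for rain, gate, cso in records:
        key = (min(int(rain // 5), 3), gate)  # 5 mm/hr intensity bins, capped
        counts[key][1] += 1
        counts[key][0] += int(cso)
    # Laplace smoothing so rare conditions get a non-degenerate estimate
    return {k: (c + 1) / (n + 2) for k, (c, n) in counts.items()}

def forecast(model, rain, gate_open):
    """Forecast CSO probability for current conditions at the station."""
    key = (min(int(rain // 5), 3), gate_open)
    return model.get(key, 0.5)  # uninformed prior for unseen conditions
```

A dashboard such as the one described above would refit and query a model like this per station as new observations stream in; the real model would condition on many more spatial and temporal factors.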
Sponsor: Illinois-Indiana Sea Grant
Description: Water chemistry data are essential for understanding the ecosystems of the Great Lakes. Multiple monitoring programs are generating different types of data, and exploring and utilizing these newly sampled and historical data sets is becoming increasingly important for better management of the Great Lakes. This project involves data analysis from three kinds of sensors: (1) a towed undulating sensor package called TRIAXUS, sampling at three river mouths in Lake Michigan; (2) a depth-profiling sensor called Seabird, which generates depth profiles of water chemistry across all the Great Lakes every year; and (3) dissolved oxygen loggers across Lake Erie, which measure lake-bottom dissolved oxygen for months at a time. Based on these data sets, we aim to provide adaptive sampling strategies that improve sampling efficiency, and to explore interesting patterns such as lake stratification, deep chlorophyll layers, and hypoxia. Geostatistics, signal processing, and other data mining/machine learning methods are implemented to analyze these noisy, real-world spatial and/or temporal sensor data.
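As an example of the kind of pattern extraction applied to depth profiles like those from the Seabird sensor, lake stratification can be located by finding the depth of the steepest temperature decrease. This is a minimal sketch under our own assumptions (the gradient threshold is arbitrary, and real profiles would first need despiking and smoothing):

```python
def thermocline_depth(depths, temps, min_gradient=0.5):
    """Locate the thermocline in a depth profile as the depth of the steepest
    temperature decrease (degC per m), a simple indicator of stratification.
    Returns None if the water column is effectively unstratified, i.e. the
    maximum gradient falls below min_gradient (an assumed threshold).
    depths must be increasing; temps are the temperatures at those depths."""
    best_depth, best_grad = None, 0.0
    for (d1, t1), (d2, t2) in zip(zip(depths, temps),
                                  zip(depths[1:], temps[1:])):
        grad = (t1 - t2) / (d2 - d1)  # positive when temperature drops with depth
        if grad > best_grad:
            best_grad, best_depth = grad, (d1 + d2) / 2
    return best_depth if best_grad >= min_gradient else None
```

For example, a summer profile dropping sharply between 10 m and 15 m yields a thermocline estimate of 12.5 m, while an isothermal winter profile yields None.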
Sponsor: Office of Naval Research through Technology, Research, Education, and Commercialization Center
Description: This project will develop and test a new collaborative group decision support system, called a cybercollaboratory, for improving real-time management of environmental hazards. For sensor and information systems to transform the status quo, complex multidisciplinary questions must be addressed by teams of experts capable of working together to develop optimum, complete, and effective system solutions. In this project, prototype cybercollaboratory software is being created and implemented within the OptIPuter Collaboratory Testbed. The OptIPuter Collaboratory will then be used to test the software and obtain feedback from users on its capabilities for supporting real-time environmental hazard management within two decision support scenarios.