This project focuses on developing a community consensus plan for the functioning and needs of the CLEANER Engineering Analysis Network (EAN). The EAN would implement the CLEANER vision through a system of instrumentation, data, and computational resources, shared by geographically distributed investigators and supported by cyberinfrastructure (a system of computers, digital data, networks, algorithms, and collaboration tools that supports geographically distributed teams of researchers and educators). This project helped define requirements for the EAN, including the cyberinfrastructure and management plans necessary to bring the CLEANER vision to fruition, using a collaborative, community-based process supported by state-of-the-art information technology.
This cooperative agreement establishes a project office to conduct planning and consortium development for this environmental observatory initiative. Key elements of the project office activities include the development of a science plan based on "grand challenge" issues that have compelling scientific merit, address critical national needs, and can be advanced only through a networked observatory-based infrastructure. Community consensus building constitutes a second major activity for the project office. This activity includes: (1) identifying and broadly engaging the engineering-science research and education communities in consensus-building activities, including development of the program plan, (2) incorporating the socio-economic community in these plans, (3) collaborating with the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI) to organize a community consortium that will lead further development of the initiative, and (4) working with NSF to involve other relevant government agencies and private sector organizations as active partners in the initiative. See http://cleaner.ncsa.uiuc.edu for more information.
DOE and other Federal agencies are making a significant investment in the development of field analytical techniques, nonintrusive technologies, and sensor technologies that will have a profound impact on the way environmental monitoring is conducted. Monitoring and performance evaluation networks will likely be based on suites of in situ sensors, with physical sampling playing a much more limited role. Designing and using these types of networks effectively will require development of a new adaptive paradigm for sampling and analysis of remedial actions, which is the overall goal of this project. Specifically, the objectives of this project are to: (1) enable effective interpretation of non-intrusive monitoring data, (2) improve predictions and assessment of remediation performance, (3) develop decision rules for on-site adaptive sampling and analysis, and (4) enable more informed decision making and risk analysis of long-term monitoring systems.
This project builds on the work begun in the NSF project to develop a risk management model for groundwater corrective action design. The model will be enhanced to allow tradeoffs to be made among risk, cost, and cleanup time under conditions of uncertainty. Innovative advancements for improving computational efficiency of the model using advanced stochastic genetic algorithms, hybrid genetic algorithms, and hierarchical multi-population genetic algorithms are also being investigated.
An optimal control model for aerobic in situ bioremediation design has been developed in previous work, but the computational effort associated with solving the model is so great that field-scale, heterogeneous sites cannot be accurately modeled. In this project, multiscale optimization methods and a hybrid genetic algorithm will be developed to improve performance and capabilities of the model. The research will be integrated with education through development of: (1) graphical user interfaces to improve access to the model by students and practitioners, (2) an educational game using the research software developed in this project, and (3) a new graduate course on coupled optimization and simulation modeling to teach students the complexities associated with developing and applying such models.
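The hybrid genetic algorithm mentioned above typically pairs a global GA search with a local refinement step applied to offspring. The following is a minimal sketch of that idea on a stand-in objective; the `cost` function, the two-variable design, and all parameter values are illustrative assumptions, not the project's actual bioremediation model.

```python
import random

random.seed(0)

# Hypothetical objective: cost of a two-variable remediation design
# (illustrative stand-in for the expensive bioremediation simulation).
def cost(x):
    return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2 + 1.0

def local_search(x, step=0.1, iters=20):
    """Coordinate hill-climbing refinement applied to each GA offspring."""
    best = list(x)
    for _ in range(iters):
        for i in range(len(best)):
            for d in (-step, step):
                trial = list(best)
                trial[i] += d
                if cost(trial) < cost(best):
                    best = trial
    return best

def hybrid_ga(pop_size=20, gens=30):
    pop = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(ai + bi) / 2 + random.gauss(0, 0.2) for ai, bi in zip(a, b)]
            children.append(local_search(child))  # hybrid step: refine each child
        pop = parents + children
    return min(pop, key=cost)

best = hybrid_ga()
```

The hybrid (memetic) structure lets the GA explore globally while the cheap local step polishes each candidate, reducing the number of expensive simulation calls needed to reach a good design.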
Petroleum companies have substantial liabilities for cleaning sites with soil and groundwater contamination, including numerous service stations across the U.S. At each site, substantial data exists, including both technical and financial data. While these data are carefully examined at individual sites, the data have rarely been examined across sites to extract knowledge and lessons learned that can be used to improve management of future liabilities. The objective of this study will be to explore how automated knowledge discovery approaches (“data mining”) can be used to discover and share such knowledge. More specifically, geological features and management practices documented in service station reports will be mined to identify features and practices that are most likely to lead to high or low future remediation liabilities.
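A minimal illustration of the kind of knowledge discovery described above is a single decision-tree-style split: find the documented feature that best separates high-liability from low-liability sites. The records, feature names, and labels below are synthetic assumptions; real inputs would be parsed from service station reports.

```python
from collections import Counter

# Synthetic site records (illustrative only).
sites = [
    {"geology": "clay", "free_product_removed": True,  "liability": "low"},
    {"geology": "clay", "free_product_removed": False, "liability": "high"},
    {"geology": "sand", "free_product_removed": True,  "liability": "low"},
    {"geology": "sand", "free_product_removed": False, "liability": "high"},
    {"geology": "clay", "free_product_removed": True,  "liability": "low"},
    {"geology": "sand", "free_product_removed": False, "liability": "high"},
]

def gini(labels):
    """Gini impurity of a list of class labels."""
    counts = Counter(labels)
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_feature(records, features, target="liability"):
    """Return the feature whose split gives the lowest weighted impurity."""
    n = len(records)
    scores = {}
    for f in features:
        score = 0.0
        for value in {r[f] for r in records}:
            subset = [r[target] for r in records if r[f] == value]
            score += len(subset) / n * gini(subset)
        scores[f] = score
    return min(scores, key=scores.get)

top = best_feature(sites, ["geology", "free_product_removed"])
```

In this toy data set the management-practice feature splits the sites perfectly while geology does not, which is exactly the kind of cross-site pattern the study aims to surface at scale.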
Given the scope of contamination of U.S. groundwater and the vast amount of money involved in corrective action, improved risk management and remediation design is a critical need. In this project, a risk management model is being developed to investigate relationships between human health risk and corrective action design under conditions of uncertainty. The methodology combines a noisy genetic algorithm, which searches for cost-effective corrective action plans, with a flow and transport model called RT3D and a human health exposure risk assessment module. Theoretical advancements for improving computational efficiency of the model will also be investigated.
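The defining feature of a noisy genetic algorithm is that each candidate's fitness is an average over several uncertain realizations rather than a single evaluation, so selection is robust to simulation noise. A minimal sketch of that sampling idea, with a hypothetical `noisy_cost` standing in for the coupled RT3D/risk evaluation:

```python
import random

random.seed(1)

# Hypothetical noisy fitness: true design cost plus Gaussian noise that
# stands in for uncertainty in the flow-and-transport / risk evaluation.
def noisy_cost(design):
    true_cost = (design - 2.0) ** 2
    return true_cost + random.gauss(0, 0.5)

def sampled_fitness(design, n_samples=10):
    """Noisy GA idea: average several noisy evaluations per candidate,
    shrinking the noise standard deviation by sqrt(n_samples)."""
    return sum(noisy_cost(design) for _ in range(n_samples)) / n_samples

# One selection step: with sampling, ranking candidates is far more reliable
# than it would be with a single noisy evaluation each.
candidates = [0.0, 1.0, 2.0, 3.0]
best = min(candidates, key=sampled_fitness)
```

The number of samples per candidate is the key tuning knob: more samples give more reliable selection at the price of more simulation calls per generation.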
Due to technical limitations and the high cost of hazardous waste site cleanup, there has been a shift toward risk-based long-term management of sites, where some contamination is left in place. A recent DOE report identifies the need for managing existing restoration sites for periods of 70 years or longer. As the groundwater remediation field matures and the installation of remediation systems is completed, it is becoming clear that long-term monitoring, operation, and stewardship (LTMOS) of these systems will comprise a significant portion of future expenditures. LTMOS data collection objectives are not well defined, and only a small portion of the data currently collected is used to assess remediation progress. The University of Illinois, Argonne National Laboratory, and DHI Water and Environment are collaborating to demonstrate how integrating all available site data can improve LTMOS decision making and provide cost savings.
The initial approach will involve using an artificial neural network (ANN) to integrate historic and current data from the 317/319 Area phytoremediation site at Argonne National Lab-East (ANL-E) with an existing Modflow flow model to provide a better understanding of site processes and direct future data collection. The complex geology and spotty history of waste disposal practices make the application of transport models impractical. The unstructured nature of ANNs allows these complexities to be included in a flexible framework for decision making.
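As a miniature illustration of the data-driven approach, the sketch below trains a single linear neuron by stochastic gradient descent to relate two site inputs to a head response. Everything here is assumed for illustration: the input variables, the synthetic data, and the one-neuron architecture; the actual work would use a multilayer ANN fitted to measured records from the 317/319 Area site and the MODFLOW model.

```python
import random

random.seed(2)

# Synthetic stand-in for site data: (rainfall, pumping rate) -> head change,
# generated from an assumed linear relation the neuron should recover.
data = [((r, p), 0.8 * r - 0.5 * p)
        for r, p in [(random.random(), random.random()) for _ in range(200)]]

# One linear neuron trained by stochastic gradient descent: an ANN in
# miniature, showing how weights are fitted directly to observed data
# without positing a transport model.
w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(50):                      # epochs
    for (x1, x2), y in data:
        pred = w[0] * x1 + w[1] * x2 + b
        err = pred - y
        w[0] -= lr * err * x1            # gradient step on each weight
        w[1] -= lr * err * x2
        b -= lr * err
```

After training, the weights recover the underlying relation; at a real site, the learned weights (or, for a deeper network, sensitivity analyses) indicate which data sources most influence the predicted response.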
The site is currently monitored under multiple programs, each with different objectives and timelines. Development of the ANN will provide a method for integrating the diverse data sources available at the site and using that information to determine the importance of each data source in achieving monitoring objectives.
Methods for efficiently implementing a risk management model on a distributed cluster of commodity computers are being investigated. The model combines a noisy genetic algorithm, which searches for cost-effective corrective action plans, with a flow and transport model called RT3D and a human health exposure risk assessment module. Commodity computers were chosen because the ultimate users of the model will be practitioners and government regulators, who may not have access to massively parallel supercomputers. The research will investigate both single- and multiple-population approaches to genetic algorithm parallelization, exploring innovative methods that should simultaneously improve computational efficiency of the model.
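In the single-population (master-worker) approach, one process runs the GA loop and farms out the expensive fitness evaluations to workers each generation. The sketch below uses threads as stand-ins for cluster nodes; a real commodity-cluster deployment would distribute the work across machines (e.g., with MPI or a process pool), and the quadratic `fitness` function is a placeholder for the RT3D simulation plus risk module.

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder for the expensive evaluation; in the actual model each call
# would run the RT3D flow-and-transport simulation and the risk module.
def fitness(design):
    return sum((d - 1.0) ** 2 for d in design)

population = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [0.5, 1.5]]

# Master-worker step: evaluate the whole population in parallel.
# pool.map preserves input order, so scores line up with the population.
with ThreadPoolExecutor(max_workers=4) as pool:
    scores = list(pool.map(fitness, population))

best = population[scores.index(min(scores))]
```

Because fitness evaluations dominate the runtime, this global parallelization scales nearly linearly with the number of nodes; the multiple-population alternative instead runs semi-independent GAs on each node with occasional migration of good individuals.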
During groundwater remediation, monitoring wells must be sampled to track the progress of remediation. Large, complex sites may have hundreds of monitoring wells that were installed for site characterization, and long-term sampling from all of these wells can cost millions of dollars per year. The objective of this project is to develop a methodology for designing cost-effective long-term monitoring plans and to demonstrate the methodology's capabilities by applying it at a field site. The method combines three primary components: a groundwater fate-and-transport simulation, several plume interpolation techniques, and a genetic algorithm to search for effective sampling plans.
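The plume interpolation component estimates concentrations between sampled wells so that candidate sampling plans can be scored on how well they reproduce the full plume. A minimal sketch of one common scheme, inverse-distance weighting, follows; the well locations and concentrations are hypothetical, and the project's actual methodology evaluates several interpolation techniques.

```python
def idw(x, y, wells, power=2.0):
    """Inverse-distance-weighted concentration estimate at (x, y) from
    sampled wells given as (wx, wy, concentration) tuples."""
    num = den = 0.0
    for wx, wy, c in wells:
        d2 = (x - wx) ** 2 + (y - wy) ** 2
        if d2 == 0.0:
            return c                      # exactly at a well: return its value
        w = 1.0 / d2 ** (power / 2)       # weight falls off with distance
        num += w * c
        den += w
    return num / den

# Hypothetical sampled wells: (x, y, concentration in mg/L).
wells = [(0.0, 0.0, 10.0), (10.0, 0.0, 2.0), (0.0, 10.0, 4.0)]
est = idw(5.0, 0.0, wells)
```

Within the larger methodology, the genetic algorithm proposes subsets of wells to sample, the interpolator reconstructs the plume from each subset, and plans are ranked by the trade-off between sampling cost and reconstruction error.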
Management of sediment contamination in water bodies is a highly complex problem, involving numerous physical, chemical, and biological processes operating in a multitude of environmental settings. To aid in developing and evaluating cost-effective strategies for sediment management, efficient and effective predictive tools are needed. The objective of this project is to identify available computer models and to evaluate which models might be most appropriate for evaluating potential sediment management scenarios. Once available models are evaluated, recommendations will be made regarding the need for development or modification of sediment management tools in future projects.
Dense nonaqueous phase liquids are common groundwater contaminants that are particularly difficult to remediate. In situ bioremediation is a promising remediation technology, but the success of this approach is limited by bioavailability. The microorganisms degrade contaminants in the aqueous phase; dissolution from the nonaqueous phase and desorption from the aquifer solids occur slowly, limiting the rate of biodegradation. We are developing a preliminary fate and transport model to evaluate the effectiveness of using heat to increase desorption, dissolution, and biodegradation. This study is the first step in a larger project to perform laboratory experiments and develop more detailed models of these processes.
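One simple way a preliminary model can represent heating is to make first-order rate constants temperature-dependent through an Arrhenius relation. The sketch below is illustrative only: the reference rate constant, activation energy, and temperatures are assumed values, not measured parameters from this study.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius(k_ref, Ea, T, T_ref=283.15):
    """Scale a reference first-order rate constant k_ref (defined at T_ref, K)
    to temperature T (K) via an Arrhenius relation. Ea (J/mol) is an
    assumed activation energy for illustration."""
    return k_ref * math.exp(-Ea / R * (1.0 / T - 1.0 / T_ref))

# Illustrative dissolution/biodegradation rate constant at 10 C,
# rescaled to a heated condition of 30 C.
k10 = 0.01                                  # 1/day, hypothetical
k30 = arrhenius(k10, Ea=50e3, T=303.15)

# Fraction of contaminant mass remaining after 100 days,
# assuming simple first-order decay at constant temperature.
m10 = math.exp(-k10 * 100)
m30 = math.exp(-k30 * 100)
```

Even this crude calculation shows the leverage heating can provide: with an activation energy of 50 kJ/mol, a 20 C temperature rise roughly quadruples the rate constant, and the mass remaining after 100 days drops sharply.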
Risk-based corrective action is rapidly becoming the method of choice for groundwater remediation. However, considerable uncertainty is associated with both the human health response to a given dose and the prediction of exposure to contaminants. In this project, we are developing a management model which will be used for in situ bioremediation design under risk-based criteria. The effects of risk and uncertainty in the risk calculations on the appropriate design will be examined. If, for example, high risk is associated with a particular site, then the reliability of the cleanup must be very high to ensure that human health is protected. Conversely, if risk is quite low, the reliability can be lower and natural processes can be relied upon to remediate the groundwater at lower cost. A coupled optimization and simulation model using a noisy genetic algorithm and finite element simulation model is used in this work.
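The reliability argument above can be made concrete with a Monte Carlo sketch: given uncertain aquifer parameters, the reliability of a design is the fraction of realizations in which the predicted concentration meets the standard. Everything here is a stand-in assumption: the surrogate concentration function, the lognormal conductivity distribution, and the cleanup standard; the actual work couples a noisy genetic algorithm with a finite element simulation model.

```python
import math
import random

random.seed(3)

def simulated_concentration(pump_rate, K):
    """Hypothetical surrogate for a fate-and-transport prediction: residual
    concentration falls with pumping effort and rises with conductivity K."""
    return 5.0 * math.exp(-0.5 * pump_rate) * K

def reliability(pump_rate, standard=1.0, n=2000):
    """Fraction of uncertain-parameter realizations meeting the standard."""
    meets = 0
    for _ in range(n):
        K = random.lognormvariate(0.0, 0.3)   # uncertain hydraulic conductivity
        if simulated_concentration(pump_rate, K) <= standard:
            meets += 1
    return meets / n
```

A high-risk site would require a design whose reliability is near one, while a low-risk site can accept a cheaper, lower-reliability design that leans on natural attenuation; in the coupled model, the noisy GA performs exactly this cost-versus-reliability search.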