Kenny Gruchalla, Ph.D.

Senior Scientist, National Renewable Energy Laboratory
Assistant Professor Adjunct, University of Colorado at Boulder
National Renewable Energy Laboratory
15013 Denver West Parkway, Golden, CO 80401-3305

Curriculum Vitae (PDF, updated 5/18/17)

I'm primarily interested in developing interactive scientific visualization techniques that provide tools for finding meaning in increasingly large and complex data. My research interests include scientific visualization, immersive visualization, high-performance scientific computing, GPU computing, topology-based feature extraction, human-computer interaction, and physics-based modeling.

Notable Videos

Visualization of a Simulated LiDAR-Based Wind Turbine Wake Measurement Campaign
S. Witter Hicks, M. Churchfield, and K. Gruchalla. SuperComputing 2016 (SC16), 2016.

Wind-plant operators would like to better control wind turbines to mitigate the wake effects between turbines, as the wakes that form behind upstream wind turbines can have significant impacts on the performance of downstream turbines. However, in order to make sound adjustments, operators need data charting the relationship between the degree of the adjustment and the resulting wake deflection. Light Detection and Ranging (LiDAR) technology, which can be programmed to measure atmospheric velocity, may be able to generate enough data to make such adjustments feasible in real time. However, LiDAR provides only a measurement of low spatial and temporal fidelity, and how accurately that measurement represents a turbine wake has not been established. To better understand the efficacy of using LiDAR to measure the wake trailing a wind turbine, we have used high-fidelity computational fluid dynamics (CFD) to simulate a wind turbine in turbulent flow, and to simulate LiDAR measurements within this high-fidelity flow. A visual analysis of the LiDAR measurement in the context of the high-fidelity wake clearly illustrates the limitations of the LiDAR resolution and contributes to the overall comprehension of LiDAR operation.
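
To illustrate the resolution issue described above, here is a minimal, self-contained sketch (not the SC16 workflow or its LES/LiDAR models): a synthetic Gaussian wake deficit is sampled both on a fine grid and by a handful of coarse, gate-averaged "LiDAR" samples. All field parameters, gate counts, and gate lengths are illustrative assumptions.

```python
# Minimal sketch: sample a synthetic wake velocity field along coarse range
# gates to show how low spatial fidelity under-resolves the wake deficit.
import numpy as np

def wake_velocity(x, y, u_inf=8.0, deficit=3.0, width=40.0):
    """Analytic stand-in for a turbine wake: Gaussian deficit behind the rotor."""
    return u_inf - deficit * np.exp(-(y**2) / (2.0 * width**2)) * (x > 0)

# High-fidelity reference: fine sampling across the wake, 200 m downstream
y_fine = np.linspace(-150, 150, 301)
u_fine = wake_velocity(200.0, y_fine)

# Simulated LiDAR: a few range gates, each averaging over its gate length
gate_centers = np.linspace(-150, 150, 7)          # 7 gates across the wake
gate_length = 30.0                                 # metres averaged per gate
u_lidar = [wake_velocity(200.0, np.linspace(c - gate_length / 2,
                                            c + gate_length / 2, 25)).mean()
           for c in gate_centers]

print("fine-grid minimum wake velocity :", u_fine.min())
print("lidar-sampled minimum velocity  :", min(u_lidar))
```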

The Insight Center at NREL
K. Gruchalla, and N. Brunhart-Lupo. SuperComputing 2014 (SC14), 2014.
*2013 NREL President's Award*

The Insight Center at the National Renewable Energy Laboratory (NREL) combines state-of-the-art visualization and collaboration tools to promote knowledge discovery in energy systems integration. Located adjacent to NREL’s high-performance computing data center, the Insight Center uses advanced visualization technology to provide on-site and remote viewing of experimental data, high-resolution visual imagery and large-scale simulation data.

Computational Modeling of Turbine-Wake Effects
K. Gruchalla, M.J. Churchfield, P.J. Moriarty, S. Lee, S. Li, J.K. Lundquist, J. Michalakes, A. Purkayastha, M.A. Sprague. SciDAC 2011, 2011.
*2011 DOE OASCR Award -- People's Choice*

As the United States moves toward utilizing more of its wind and water resources for electrical power generation, computational modeling will play an increasingly important role in improving performance, decreasing costs, and accelerating the deployment of wind and water power technologies. We are developing computational models to better understand the wake effects of wind and marine hydrokinetic turbines, which operate on the same principles. Large wind plants are consistently found to perform below expectations. Inadequate accounting for various turbulent-wake effects is believed to be partly responsible for this underperformance.

Particle Dynamics in a Fluidized Bed Reactor
K. Gruchalla, P. Pepiot, O. Desjardins. SciDAC 2010, 2010.
*2010 DOE OASCR Award -- Outstanding Achievement in Scientific Visualization*

Fluidized bed reactors are a promising technology for the thermochemical conversion of biomass in biofuel production. However, the current understanding of the behavior of the materials in a fluidized bed is limited. We are using high-fidelity simulations to better understand the mechanics of the conversion processes. This video visualizes a simulation of a periodic bed of sand fluidized by a gas stream injected at the bottom. A Lagrangian approach describes the solid phase, in which particles with a soft-sphere collision model are individually tracked throughout the reactor. A large-scale point-particle direct numerical simulation involving 12 million particles and 4 million mesh points has been performed, requiring 512 cores for 4 days. The onset of fluidization is characterized by the formation of several large bubbles that rise and burst at the surface and is followed by a pseudo-steady turbulent motion showing intense bubble activity.
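
For readers unfamiliar with the soft-sphere collision model mentioned above, the sketch below shows the basic spring-dashpot normal contact force between two particles. It is an illustration of the general technique, not the production solver; the stiffness, damping, and particle properties are placeholder values.

```python
# Illustrative soft-sphere (spring-dashpot) normal contact force between two
# particles. Parameter values are placeholders, not the simulation's.
import numpy as np

def soft_sphere_force(x1, x2, v1, v2, r1, r2, k=1.0e4, eta=5.0):
    """Return the normal contact force on particle 1 (zero if not touching)."""
    d = x2 - x1
    dist = np.linalg.norm(d)
    overlap = (r1 + r2) - dist
    if overlap <= 0.0:
        return np.zeros(3)
    n = d / dist                              # unit normal from particle 1 to 2
    v_rel_n = np.dot(v1 - v2, n)              # normal relative velocity
    f_mag = k * overlap + eta * v_rel_n       # spring + dashpot terms
    return -f_mag * n                         # repulsive force on particle 1

f = soft_sphere_force(np.array([0.0, 0.0, 0.0]), np.array([0.9e-3, 0.0, 0.0]),
                      np.array([0.1, 0.0, 0.0]), np.zeros(3),
                      r1=0.5e-3, r2=0.5e-3)
print(f)   # force pushing particle 1 away from the overlapping neighbor
```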

Numerical Simulation of a Turbulent Atomizing Liquid Jet
K. Gruchalla, O. Desjardins, P. Pepiot, A. Purkayastha. SuperComputing 2010 (SC10), 2010.

A detailed numerical simulation of a turbulent liquid jet. Atomization of liquid fuel is the process by which a coherent liquid flow disintegrates into droplets. Understanding atomization will have far-reaching repercussions on many aspects of the combustion process. This was a large-scale scaling study run on Red Mesa, involving 1.36 billion cells, computed on 12,228 cores and rendered on 1,248 cores.


Publications

Prediction and characterization of application power use in a high-performance computing environment
B. Bugbee, C. Phillips, H. Egan, R. Elmore, K. Gruchalla, and A. Purkayastha. Statistical Analysis and Data Mining: The ASA Data Science Journal, February 2017.

Power use in data centers and high-performance computing (HPC) facilities has grown in tandem with increases in the size and number of these facilities. Substantial innovation is needed to enable meaningful reduction in energy footprints in leadership-class HPC systems. In this paper, we focus on characterizing and investigating application-level power usage. We demonstrate potential methods for predicting power usage based on a priori and in situ characteristics. Finally, we highlight a potential use case of this method through a simulated power-aware scheduler using historical jobs from a real scientific HPC system.
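
As a rough illustration of the idea of predicting job power from a priori characteristics, here is a minimal sketch using a generic regression model on synthetic data. The feature names, data, and model choice are assumptions for illustration only; the paper's actual features and methods may differ.

```python
# Minimal sketch: predict job-level power draw from a priori job features.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_jobs = 500
X = np.column_stack([
    rng.integers(1, 65, n_jobs),          # nodes requested (hypothetical feature)
    rng.uniform(0.5, 24.0, n_jobs),       # requested wall time in hours
    rng.integers(0, 3, n_jobs),           # encoded application class
])
# Synthetic "true" per-node power with noise, only to exercise the model
y = 250 + 30 * X[:, 2] + 2 * X[:, 0] + rng.normal(0, 15, n_jobs)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out jobs:", round(model.score(X_test, y_test), 3))
```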

Ab Initio Surface Phase Diagrams for Coadsorption of Aromatics and Hydrogen on the Pt(111) Surface
G.A. Ferguson, V. Vorotnikov, N. Wunder, J. Clark, K. Gruchalla, T. Bartholomew, D.J. Robichaud, and G.T. Beckham. The Journal of Physical Chemistry C, November 2016.

Supported metal catalysts are commonly used for the hydrogenation and deoxygenation of biomass-derived aromatic compounds in catalytic fast pyrolysis. To date, the substrate–adsorbate interactions under reaction conditions crucial to these processes remain poorly understood, yet understanding this is critical to constructing detailed mechanistic models of the reactions important to catalytic fast pyrolysis. Density functional theory (DFT) has been used in identifying mechanistic details, but many of these works assume surface models that are not representative of realistic conditions, for example, under which the surface is covered with some concentration of hydrogen and aromatic compounds. In this study, we investigate hydrogen-guaiacol coadsorption on Pt(111) using van der Waals-corrected DFT and ab initio thermodynamics over a range of temperatures and pressures relevant to bio-oil upgrading. We find that relative coverage of hydrogen and guaiacol is strongly dependent on the temperature and pressure of the system. Under conditions relevant to ex situ catalytic fast pyrolysis (CFP; 620–730 K, 1–10 bar), guaiacol and hydrogen chemisorb to the surface with a submonolayer hydrogen (∼0.44 ML H), while under conditions relevant to hydrotreating (470–580 K, 10–200 bar), the surface exhibits a full-monolayer hydrogen coverage with guaiacol physisorbed to the surface. These results correlate with experimentally observed selectivities, which show ring saturation to methoxycyclohexanol at hydrotreating conditions and deoxygenation to phenol at CFP-relevant conditions. Additionally, the vibrational energy of the adsorbates on the surface significantly contributes to surface energy at higher coverage. Ignoring this contribution results in not only quantitatively, but also qualitatively incorrect interpretation of coadsorption, shifting the phase boundaries by more than 200 K and ∼10–20 bar and predicting no guaiacol adsorption under CFP and hydrotreating conditions. The implications of this work are discussed in the context of modeling hydrogenation and deoxygenation reactions on Pt(111), and we find that only the models representative of equilibrium surface coverage can capture the hydrogenation kinetics correctly. Last, as a major outcome of this work, we introduce a freely available web-based tool, dubbed the Surface Phase Explorer (SPE), which allows researchers to conveniently determine surface composition for any one- or two-component system at thermodynamic equilibrium over a wide range of temperatures and pressures on any crystalline surface using standard DFT output.
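
The sketch below shows, in heavily simplified form, the ab initio thermodynamics bookkeeping behind such a surface phase diagram: at each temperature and hydrogen pressure, pick the coverage that minimizes the surface free energy. It is not the Surface Phase Explorer; the DFT energies are placeholders chosen only to reproduce the qualitative trend described above, and the entropy correction is a crude constant stand-in for tabulated values.

```python
# Simplified ab initio thermodynamics sketch: stable surface phase vs (T, p_H2).
import numpy as np

KB = 8.617e-5                      # Boltzmann constant, eV/K
E_SLAB = -100.00                   # clean slab DFT energy (placeholder, eV)
E_H2 = -6.77                       # gas-phase H2 DFT energy (placeholder, eV)
AREA = 1.0                         # surface area in arbitrary units
# (label, number of adsorbed H, slab+adsorbate energy) -- placeholder values
PHASES = [("clean", 0, -100.00), ("0.44 ML H", 4, -115.60), ("1.00 ML H", 9, -134.10)]

def mu_H(T, p, p0=1.0, s_h2=1.36e-3):
    """Chemical potential of atomic H referenced to gas-phase H2; s_h2 is a
    crude constant-entropy stand-in (eV/K per H2) for tabulated corrections."""
    return 0.5 * (E_H2 - T * s_h2 + KB * T * np.log(p / p0))

def stable_phase(T, p):
    gammas = [(E_ads - E_SLAB - n * mu_H(T, p)) / AREA for _, n, E_ads in PHASES]
    return PHASES[int(np.argmin(gammas))][0]

for T, p in [(650.0, 5.0), (500.0, 100.0)]:   # CFP-like vs hydrotreating-like
    print(f"T = {T:4.0f} K, p_H2 = {p:5.1f} bar -> {stable_phase(T, p)}")
```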

Visualization of the Eastern Renewable Generation Integration Study
K. Gruchalla, J. Novacheck, A. Bloom. SuperComputing 2016 (SC16), November 2016.

The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Québec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI could balance the variability and uncertainty of high penetrations of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze four large, highly multivariate scenarios with high spatial and temporal resolutions.

Interpretation of Simultaneous Mechanical-Electrical-Thermal Failure in a Lithium-Ion Battery Module
C. Zhang, S. Santhanagopalan, M.J. Stock, N. Brunhart-Lupo, K. Gruchalla. SuperComputing 2016 (SC16), November 2016.

Lithium-ion batteries are currently the state-of-the-art power sources for electric vehicles, and their safety behavior when subjected to abuse, such as a mechanical impact, is of critical concern. A coupled mechanical-electrical-thermal model for simulating the behavior of a lithium-ion battery under a mechanical crush has been developed. We present a series of production-quality visualizations to illustrate the complex mechanical and electrical interactions in this model.

Feeder Voltage Regulation with High-Penetration PV Using Advanced Inverters and a Distribution Management System: A Duke Energy Case Study
B. Palmintier, J. Giraldez, K. Gruchalla, P. Gotseff, A. Nagarajan, T. Harris, B. Bugbee, M. Baggu, J. Gantz, and E. Boardman. NREL Technical Report NREL/TP-5D00-65551, November 2016.

Duke Energy, Alstom Grid (now GE Grid Solutions), and the National Renewable Energy Laboratory (NREL) collaborated to better understand advanced inverter and distribution management system (DMS) control options for large (1–5 MW) distributed solar photovoltaics (PV) and their impacts on distribution system operations. The specific goal of the project was to compare the operational impacts, specifically on voltage regulation, of three methods of managing voltage variations resulting from such PV systems: active power, local autonomous inverter control, and integrated volt/VAR control (IVVC). The project found that all tested configurations of DMS-controlled IVVC improved performance and provided operational cost savings compared to the baseline and local control modes. Specifically, IVVC combined with PV operating at a 0.95 power factor (PF) proved to be the most technically effective voltage-management scheme for the system studied. This configuration substantially reduced both utility regulation equipment operations and observed voltage challenges.
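
As a quick worked example of what operating PV at a 0.95 power factor implies for reactive-power capability, the snippet below applies the standard power-triangle relationship. The 2 MW plant size is an illustrative assumption, not a value from the study.

```python
# Reactive power implied by a 0.95 power factor, for an illustrative 2 MW plant.
import math

p_mw = 2.0                        # active power output, MW (illustrative)
pf = 0.95                         # operating power factor
q_mvar = p_mw * math.tan(math.acos(pf))   # Q = P * tan(acos(PF))
s_mva = p_mw / pf                          # apparent power
print(f"Q at {pf} PF: {q_mvar:.2f} MVAr (apparent power {s_mva:.2f} MVA)")
```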

City Scale Modeling with OpenStudio
D. Macumber, K. Gruchalla, N. Brunhart-Lupo, M. Gleason, J. Abbot-Whitley, J. Robertson, B. Polly, K. Fleming, M. Schott. ASHRAE and IBPSA-USA SimBuild 2016, August 2016.

Assessing the impact of energy efficiency technologies at a city scale is of great interest to city planners, utility companies, and policy makers. This paper describes a flexible framework that can be used to create and run city-scale building energy simulations. The framework is built around the new OpenStudio City Database (CityDB). Building footprints, building height, building type, and other data can be imported into the database from public records or other sources. The OpenStudio City User Interface (CityUI) can be used to inspect and edit data in the CityDB. Unknown data can be inferred or assigned from a statistical sampling of other datasets such as the Commercial Buildings Energy Consumption Survey (CBECS) or Residential Energy Consumption Survey (RECS). Once all required data is available, OpenStudio measures are used to create starting-point energy models for each building in the dataset and to model particular energy efficiency measures for each building. Together, this framework allows a user to pose several scenarios, such as “what if 30% of the commercial retail buildings added rooftop solar” or “what if all elementary schools converted to ground-source heat pumps”, and then visualize the impacts at a city scale. This paper focuses on modeling existing building stock using public records; however, the framework is capable of supporting the evaluation of new construction and the use of proprietary data sources.
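
The "infer unknown data by statistical sampling" step can be pictured with the small sketch below: a missing attribute is filled by sampling reference records of the same building type. The column names and data are hypothetical; this is not the CityDB or CBECS schema.

```python
# Sketch: fill a missing building attribute by sampling a reference dataset.
import numpy as np
import pandas as pd

city = pd.DataFrame({
    "building_type": ["retail", "school", "retail"],
    "floor_area_m2": [1200.0, np.nan, np.nan],      # missing values to infer
})
reference = pd.DataFrame({                           # stand-in for CBECS/RECS
    "building_type": ["retail"] * 4 + ["school"] * 3,
    "floor_area_m2": [900, 1500, 1100, 1300, 5200, 6100, 4800],
})

rng = np.random.default_rng(42)
def infer(row):
    if np.isnan(row["floor_area_m2"]):
        pool = reference.loc[reference.building_type == row.building_type,
                             "floor_area_m2"]
        return rng.choice(pool.to_numpy())           # sample a plausible value
    return row["floor_area_m2"]

city["floor_area_m2"] = city.apply(infer, axis=1)
print(city)
```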

Eastern Renewable Generation Integration Study
A. Bloom, A. Townsend, D. Palchak, J. Novacheck, J. King, C. Barrows, E. Ibanez, M. O'Connell, G. Jordan, B. Roberts, C. Draxl, K. Gruchalla. NREL Technical Report NREL/TP-6A20-64472, August 2016.
*NREL 2017 Innovation & Technology Transfer Outstanding Public Information Award*

The U.S. Department of Energy commissioned the National Renewable Energy Laboratory (NREL) to answer a question: What conditions might system operators face if the Eastern Interconnection (EI), a system designed to operate reliably with fossil-fueled, nuclear, and hydro generation, was transformed to one that relied on wind and solar photovoltaics (PV) to meet 30% of annual electricity demand? In the resulting study, the Eastern Renewable Generation Integration Study (ERGIS), NREL answers that question and, in doing so, gives insights into the likely operational impacts of higher percentages (up to 30% on an annual energy basis, with instantaneous penetrations over 50%) of combined wind and PV generation in the EI. We evaluate potential power system futures where significant portions of the existing generation fleet are retired and replaced by different portfolios of transmission, wind, PV, and natural gas generation. We explore how variable and uncertain conditions caused by wind and solar forecast errors, seasonal and diurnal patterns, weather, and system operating constraints impact certain aspects of reliability and economic efficiency. Specifically, we model how the system could meet electricity demand at a 5-minute time interval by scheduling resources for known ramping events, while maintaining adequate reserves to meet random variation in supply and demand, and contingency events.

Simulation Exploration through Immersive Parallel Planes
N. Brunhart-Lupo, B.W. Bush, K. Gruchalla, S. Smith. IEEE VR 2016 Workshop on Immersive Analytics, March 2016.

We present a visualization-driven simulation system that tightly couples system dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive “parallel-planes” visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be “brushed” to highlight and select observations of interest; a “slider” control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users’ realization of insights about the simulation and its output.
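
The data-side of the parallel-planes mapping can be sketched in a few lines: pair up dimensions, "brush" an axis-aligned rectangle on one plane, and keep only the observations whose polylines pass through it. This is an illustration of the idea only, not the immersive system; the dimension names and data are hypothetical.

```python
# Sketch: pair dimensions into planes and brush a rectangle to select observations.
import numpy as np

rng = np.random.default_rng(1)
# Multivariate observations from a batch of simulations: (observation, dimension)
data = rng.normal(size=(1000, 6))
dims = ["price", "demand", "capacity", "storage", "emissions", "cost"]
planes = [(0, 1), (2, 3), (4, 5)]      # pairs of dimensions -> parallel planes

def brush(data, plane, x_range, y_range):
    """Boolean mask of observations falling in a brushed rectangle on a plane."""
    xi, yi = plane
    return ((data[:, xi] >= x_range[0]) & (data[:, xi] <= x_range[1]) &
            (data[:, yi] >= y_range[0]) & (data[:, yi] <= y_range[1]))

selected = brush(data, planes[1], x_range=(0.0, 2.0), y_range=(-1.0, 1.0))
print(f"{selected.sum()} of {len(data)} observations pass through the brush")
```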

An Analysis of Application Power and Schedule Composition in a High-Performance Computing Environment
R. Elmore, K. Gruchalla, C. Phillips, A. Purkayastha, N. Wunder. NREL Technical Report NREL/TP-2C00-65392, January 2016.

As the capacity of high-performance computing (HPC) systems continues to grow, small changes in energy management have the potential to produce significant energy savings. In this paper, we employ an extensive informatics system for aggregating and analyzing real-time performance and power use data to evaluate energy footprints of jobs running in an HPC data center. We look at the effects of algorithmic choices for a given job on the resulting energy footprints, analyze application-specific power consumption, and summarize average power use in the aggregate. All of these views reveal meaningful power variance between classes of applications as well as between chosen methods for a given job. Using these data, we discuss energy-aware cost-saving strategies based on reordering the HPC job schedule. Using historical job and power data, we present a hypothetical job schedule reordering that (1) reduces the facility’s peak power draw and (2) manages power in conjunction with a large-scale photovoltaic array. Lastly, we leverage these data to understand the practical limits on predicting key power use metrics at the time of submission.
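
The schedule-reordering idea can be pictured with a toy greedy placement that assigns each job to the start slot keeping the facility's peak power lowest. This is a minimal sketch under illustrative job sizes and power draws, not the report's scheduler model.

```python
# Toy sketch of power-aware schedule reordering to flatten the facility profile.
import numpy as np

# (duration in slots, mean power draw in kW) for a handful of illustrative jobs
jobs = [(4, 300), (2, 500), (6, 120), (3, 400), (2, 250)]
horizon = 12
profile = np.zeros(horizon)

for duration, power in sorted(jobs, key=lambda j: -j[1]):   # place big loads first
    best_start, best_peak = 0, float("inf")
    for start in range(horizon - duration + 1):
        trial = profile.copy()
        trial[start:start + duration] += power
        if trial.max() < best_peak:                         # keep the flattest option
            best_start, best_peak = start, trial.max()
    profile[best_start:best_start + duration] += power

print("slot power profile (kW):", profile.astype(int))
print("peak facility draw (kW):", int(profile.max()))
```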

Biomass accessibility analysis using electron tomography
J. D. Hinkle, P. N. Ciesielski, K. Gruchalla, K. R. Munch, B. S. Donohoe. Biotechnology for Biofuels, December 2015.

Substrate accessibility to catalysts has been a dominant theme in theories of biomass deconstruction. However, current methods of quantifying accessibility do not elucidate mechanisms for increased accessibility due to changes in microstructure following pretreatment. We introduce methods for characterization of surface accessibility based on fine-scale microstructure of the plant cell wall as revealed by 3D electron tomography. These methods comprise a general framework, enabling analysis of image-based cell wall architecture using a flexible model of accessibility. We analyze corn stover cell walls, both native and after undergoing dilute acid pretreatment with and without a steam explosion process, as well as AFEX pretreatment. Image-based measures provide useful information about how much pretreatments are able to increase biomass surface accessibility to a wide range of catalyst sizes. We find a strong dependence on probe size when measuring surface accessibility, with a substantial decrease in biomass surface accessibility to probe sizes above 5 nm radius compared to smaller probes.
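
The probe-size dependence can be illustrated with a simple morphological sketch: approximate the pore space reachable by a spherical probe of radius r as a binary opening of the segmented pore volume with a ball of that radius. This is an illustration of the general idea on a synthetic volume, not the paper's accessibility framework or its tomography data.

```python
# Sketch: pore-space accessibility as a function of probe radius via morphology.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
solid = ndimage.gaussian_filter(rng.random((64, 64, 64)), 3) > 0.5   # toy "cell wall"
pore = ~solid

def ball(radius):
    """Spherical structuring element of the given voxel radius."""
    z, y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1, -radius:radius + 1]
    return x**2 + y**2 + z**2 <= radius**2

for r in (1, 3, 5):
    accessible = ndimage.binary_opening(pore, structure=ball(r))
    frac = accessible.sum() / pore.sum()
    print(f"probe radius {r} voxels: {100 * frac:.1f}% of pore space accessible")
```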

Evaluating the Efficacy of Wavelet Configurations on Turbulent-Flow Data
S. Li, K. Gruchalla, K. Potter, J. Clyne, H. Childs. In Proceedings of IEEE Symposium on Large Data Analysis and Visualization, October 2015.

I/O is increasingly becoming a significant constraint for simulation codes and visualization tools on modern supercomputers. Data compression is an attractive workaround, and, in particular, wavelets provide a promising solution. However, wavelets can be applied in multiple configurations, and the variations in configuration impact accuracy, storage cost, and execution time. While the variation in these factors across wavelet configurations has been explored in image processing, it is not well understood for visualization and analysis of scientific data. To illuminate this issue, we evaluate multiple wavelet configurations on turbulent-flow data. Our approach is to repeat established analysis routines on uncompressed and lossy-compressed versions of a data set, and then quantitatively compare their outcomes. Our findings show that accuracy varies greatly based on wavelet configuration, while storage cost and execution time vary less. Overall, our study provides new insights for simulation analysts and visualization experts, who need to make tradeoffs between accuracy, storage cost, and execution time.
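
A minimal sketch of this kind of experiment, using PyWavelets on a synthetic field: compress with different wavelet/retention settings, reconstruct, and compare the error. The wavelet choices, retention fractions, and random field below are illustrative assumptions, not the study's configurations or data.

```python
# Sketch: lossy wavelet compression of a 3D field at several configurations.
import numpy as np
import pywt

rng = np.random.default_rng(0)
field = rng.standard_normal((64, 64, 64))        # stand-in for turbulent-flow data

for wavelet, keep in [("haar", 0.05), ("bior4.4", 0.05), ("bior4.4", 0.01)]:
    coeffs = pywt.wavedecn(field, wavelet, level=3)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)  # keep largest `keep` fraction
    arr[np.abs(arr) < thresh] = 0.0
    recon = pywt.waverecn(pywt.array_to_coeffs(arr, slices,
                                               output_format="wavedecn"), wavelet)
    recon = recon[:field.shape[0], :field.shape[1], :field.shape[2]]
    rmse = np.sqrt(np.mean((field - recon) ** 2))
    print(f"{wavelet:8s} keep {keep:4.0%}: RMSE {rmse:.3f}")
```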

Segmentation and Visualization of Multivariate Features using Feature-Local Distributions
K. Gruchalla, M. Rast, E. Bradley, P. Mininni. In Advances in Visual Computing, Lecture Notes in Computer Science (vol. 6938), 2011.

We introduce an iterative feature-based transfer function design that extracts and systematically incorporates multivariate feature-local statistics into a texture-based volume rendering process. We argue that an interactive multivariate feature-local approach is advantageous when investigating ill-defined features, because it provides a physically meaningful, quantitatively rich environment within which to examine the sensitivity of the structure properties to the identification parameters. We demonstrate the efficacy of this approach by applying it to vortical structures in Taylor-Green turbulence. Our approach identified the existence of two distinct structure populations in these data, which cannot be isolated or distinguished via traditional transfer functions based on global distributions.
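
A stripped-down sketch of the feature-local idea: segment candidate structures with a global threshold on one variable, then compute per-feature distributions of a second variable for refinement. The synthetic fields, variable names, and thresholds below are illustrative; this is not the paper's transfer-function pipeline or its Taylor-Green data.

```python
# Sketch: segment features globally, then compute feature-local statistics.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
enstrophy = ndimage.gaussian_filter(rng.random((64, 64, 64)), 2)
helicity = ndimage.gaussian_filter(rng.standard_normal((64, 64, 64)), 2)

# A global threshold gives candidate vortical features...
mask = enstrophy > np.quantile(enstrophy, 0.95)
labels, n = ndimage.label(mask)

# ...and each feature gets its own local distribution of a second variable.
for feature_id in range(1, min(n, 5) + 1):
    local = np.abs(helicity[labels == feature_id])
    print(f"feature {feature_id}: voxels={local.size:5d} "
          f"mean |helicity|={local.mean():.3f}  max={local.max():.3f}")
```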

Computational Modeling of Wind-Plant Aerodynamics
M.A. Sprague, P.J. Moriarty, M.J. Churchfield, K. Gruchalla, S. Lee, J.K. Lundquist, J. Michalakes, A. Purkayastha. In Proceedings of SciDAC 2011, 2011.

As the US moves toward 20% wind power by 2030, computational modeling will play an increasingly important role in determining wind-plant siting, designing more efficient and reliable wind turbines, and understanding the interaction between large wind plants and regional weather. From a computing perspective, however, adequately resolving the relevant scales of wind-energy production is a petascale problem verging on exascale. In this paper, we discuss the challenges associated with computational simulation of the multiscale wind-plant system, which includes turbine-scale turbulence, atmospheric-boundary-layer turbulence, and regional-weather variation. An overview of computational modeling approaches is presented, and our particular modeling strategy is described, which involves modification and coupling of three open-source codes (FAST, OpenFOAM, and WRF) for structural aeroelasticity, local fluid dynamics, and mesoscale fluid dynamics, respectively.

Simulation Characterization and Optimization of Metabolic Models with the High-Performance Systems Biology Toolkit
M. Lunacek, A. Nag, D. Alber, K. Gruchalla, C.H. Chang, P.A. Graf. SIAM Journal on Scientific Computing (vol. 33), 2011.

The High-Performance Systems Biology Toolkit (HiPer SBTK) is a collection of simulation and optimization components for metabolic modeling and the means to assemble them into large parallel processing hierarchies suited to a particular simulation and optimization need. The components come in a variety of different categories: model translation, model simulation, parameter sampling, sensitivity analysis, parameter estimation, and optimization. They can be configured at runtime into hierarchically parallel arrangements to perform nested combinations of simulation and characterization tasks with excellent parallel scaling to thousands of processors. We describe the observations that led to the system, the components, and how one can arrange them. We show nearly 90% efficient scaling to over 13,000 processors, and we demonstrate three complex yet typical examples that have run on ∼1000 processors and accomplished billions of stiff ordinary differential equation simulations.

Integration and Dissemination of Citizen Reported and Seismically Derived Earthquake Information via Social Network Technologies
M. Guy, P. Earle, C. Ostrum, K. Gruchalla, S. Horvath. In Advances in Intelligent Data Analysis, Lecture Notes in Computer Science (vol. 6065), 2010.

People in the locality of earthquakes are publishing anecdotal information about the shaking within seconds of their occurrences via social network technologies, such as Twitter. In contrast, depending on the size and location of the earthquake, scientific alerts can take between two and twenty minutes to publish. We describe TED (Twitter Earthquake Detector), a system that adopts social network technologies to augment earthquake response products and the delivery of hazard information. The TED system analyzes data from these social networks for multiple purposes: (1) to integrate citizen reports of earthquakes with corresponding scientific reports, (2) to infer the public level of interest in an earthquake for tailoring outputs disseminated via social network technologies, and (3) to explore the possibility of rapid detection of a probable earthquake, within seconds of its occurrence, helping to fill the gap between the earthquake origin time and the presence of quantitative scientific data.
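
The rapid-detection idea in point (3) can be pictured with a toy rate-spike detector: flag a probable event when the short-term tweet rate for a keyword far exceeds its long-term background rate. The window lengths, threshold ratio, and counts below are illustrative assumptions, not TED's actual detection parameters.

```python
# Toy sketch: flag a probable earthquake from a sudden spike in keyword tweet rates.
import numpy as np

def spike_detected(counts_per_minute, short=2, long=60, ratio=10.0, min_count=20):
    """counts_per_minute: recent per-minute tweet counts, most recent minute last."""
    recent = np.mean(counts_per_minute[-short:])
    background = np.mean(counts_per_minute[-long:]) + 1e-9
    return recent >= min_count and recent / background >= ratio

history = [2, 1, 3, 2, 1, 2, 1, 0, 2, 1] * 6    # quiet hour of background chatter
print(spike_detected(history))                   # False
history[-2:] = [180, 240]                        # sudden burst of "earthquake" tweets
print(spike_detected(history))                   # True
```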

VAPOR: Visual, Statistical, and Structural Analysis of Astrophysical Flows
J. Clyne, K. Gruchalla, M. Rast. In Proceedings of Numerical Modeling of Space Plasma Flows: Astronum-2009, 2009.

In this paper, we discuss recent developments in the capabilities of VAPOR (open source, available at http://www.vapor.ucar.edu): a desktop application that leverages today’s powerful CPUs and GPUs to enable visualization and analysis of terascale data sets using only a commodity PC or laptop. We review VAPOR's current capabilities, highlighting support for Adaptive Mesh Refinement (AMR) grids, and present new developments in interactive feature-based visualization and statistical analysis.

Progressive Visualization-Driven Multivariate Feature Definition and Analysis
K. Gruchalla. Ph.D. Thesis, University of Colorado at Boulder, 2009.

One of the barriers to visualization-enabled scientific discovery is the difficulty in clearly and quantitatively articulating the meaning of a visualization, particularly in the exploration of relationships between multiple variables in large-scale data sets. This issue becomes more complicated in the visualization of three-dimensional turbulence, since geometry, topology, and statistics play complicated, intertwined roles in the definitions of the features of interest, making them difficult or impossible to precisely describe. This dissertation develops and evaluates a novel interactive multivariate volume visualization framework that allows features to be progressively isolated and defined using a combination of global and feature-local properties. I argue that a progressive and interactive multivariate feature-local approach is advantageous when investigating ill-defined features because it provides a physically meaningful, quantitatively rich environment within which to examine the sensitivity of the structure properties to the identification parameters. The efficacy of this approach is demonstrated in the analysis of vortical structures in Taylor-Green turbulence. Through this analysis, two distinct structure populations have been discovered in these data: structures with minimal and maximal local absolute helicity distributions. These populations cannot be distinguished via global distributions; however, they were readily identified by this approach, since their feature-local statistics are distinctive.

Visualization-Driven Structural and Statistical Analysis of Turbulent Flows
K. Gruchalla, M. Rast, E. Bradley, J. Clyne, P. Mininni. In Advances in Intelligent Data Analysis, Lecture Notes in Computer Science (vol. 5772), 2009.

Knowledge extraction from data volumes of ever increasing size requires ever more flexible tools to facilitate interactive query. Interactivity enables real-time hypothesis testing and scientific discovery, but can generally not be achieved without some level of data reduction. The approach described in this paper combines multi-resolution access, region-of-interest extraction, and structure identification in order to provide interactive spatial and statistical analysis of a terascale data volume. Unique aspects of our approach include the incorporation of both local and global statistics of the flow structures, and iterative refinement facilities, which combine geometry, topology, and statistics to allow the user to effectively tailor the analysis and visualization to the science. Working together, these facilities allow a user to focus the spatial scale and domain of the analysis and perform an appropriately tailored multivariate visualization of the corresponding data. All of these ideas and algorithms are instantiated in a deployed visualization and analysis tool called VAPOR, which is in routine use by scientists internationally. In data from a 1024x1024x1024 simulation of a forced turbulent flow, VAPOR allowed us to perform a visual data exploration of the flow properties at interactive speeds, leading to the discovery of novel scientific properties of the flow. This kind of intelligent, focused analysis/refinement approach will become even more important as computational science moves towards petascale applications.

Immersive Examination of the Qualitative Structure of Biomolecules
K. Gruchalla, M. Dubin, J. Marbach, E. Bradley. In Proceedings of the International Workshop on Qualitative Reasoning about Physical Systems, 2008.

We studied the added value of using immersive visualization as a molecular research tool. We present our results in the context of “embodied cognition”, as a way to understand situations in which immersive virtual visualization may be particularly useful. PyMOL, a non-immersive application used by biochemistry researchers, was ported to an immersive virtual environment (IVE) to run on a four-PC cluster. Three research groups were invited to extend their current research on a molecule of interest to include an investigation of that molecule inside the IVE. The groups each had a similar experience of visualizing a feature of their molecule they had not previously appreciated from workstation viewing; large-scale spatial features, such as pockets and ridges, were readily identified when walking around the molecule displayed at human scale. We suggest that this added value arises because an IVE affords the opportunity to visualize the molecule using normal, everyday-world perceptual abilities that have been tuned and practiced from birth. This work also suggests that short sessions of IVE viewing can valuably augment extensive, non-IVE-based visualizations.

Porting Legacy Applications to Immersive Virtual Environments: A Case Study
K. Gruchalla, J. Marbach, M. Dubin. In Proceedings of the International Conference on Computer Graphics Theory and Applications, 2007.

Immersive virtual environments are becoming increasingly common, driving the need to develop new software or adapt existing software to these environments. We discuss some of the issues and limitations of porting an existing molecular graphics system, PyMOL, into an immersive virtual environment. Presenting macromolecules inside an interactive immersive virtual environment may provide unique insights into molecular structure and improve the rational design of drugs that target a specific molecule. PyMOL was successfully extended to render molecular structures immersively; however, elements of the legacy interactive design did not scale well into three-dimensions. Achieving an interactive frame rate for large macromolecules was also an issue. The immersive system was developed and evaluated on both a shared-memory parallel machine and a commodity cluster.

Immersive Visualization of the Hurricane Isabel Dataset
K. Gruchalla, J. Marbach. IEEE Visualization 2004, 2004.
*2nd place IEEE Visualization 2004 Contest*

In this paper, we describe an immersive prototype application, AtmosV, developed to interactively visualize the large multivariate atmospheric dataset provided by the IEEE Visualization 2004 Contest committee. The visualization approach is a combination of volume and polygonal rendering. The immersive application was developed and evaluated on both a shared-memory parallel machine and a commodity cluster. Using the cluster we were able to visualize multiple variables at interactive frame rates.

Immersive Well-Path Editing: Investigating the added value of immersion
K. Gruchalla. IEEE VR 2004, 2004.

The benefits of immersive visualization are primarily anecdotal; there have been few controlled user studies that have attempted to quantify the added value of immersion for problems requiring the manipulation of virtual objects. This research quantifies the added value of immersion for a real-world industrial problem: oil well-path planning. An experiment was designed to compare human performance between an immersive virtual environment (IVE) and a desktop workstation. This work presents the results of sixteen participants who planned the paths of four oil wells. Each participant planned two well-paths on a desktop workstation with a stereoscopic display and two well-paths in a CAVE-like IVE. Fifteen of the participants completed well-path editing tasks faster in the IVE than in the desktop environment. The increased speed was complemented by a statistically significant increase in correct solutions in the IVE. The results suggest that an IVE can allow for faster and more accurate problem solving in a complex three-dimensional domain.

Immersive Well-Path Planning: Investigating the added value of immersion
K. Gruchalla. Master's Thesis, University of Colorado at Boulder, 2003.

The benefits of immersive visualization are primarily anecdotal; there have been few controlled user studies that have attempted to quantify the added value of immersion for problems requiring the manipulation of virtual objects. This research quantifies the added value of immersion for a real-world industrial problem: oil well path planning. An experiment was designed to compare human performance between an immersive virtual environment (IVE) and a desktop workstation with stereoscopic display. This work consisted of building a cross-environment application, capable of visualizing and editing a planned well path within an existing oilfield, and conducting a user study on that application. This work presents the results of sixteen participants who planned the paths of four oil wells. Each participant planned two well paths on a desktop workstation with a stereoscopic display and two well paths in a CAVE-like IVE. Fifteen of the participants completed well path editing tasks faster in the IVE than in the desktop environment, which is statistically significant (p < 0.001). The increased speed in the IVE was complemented by an increase in correct solutions. There was a statistically significant (p < 0.05) increase in correct solutions in the IVE. The results suggest that an IVE allows for faster and more accurate problem solving in a complex interactive three-dimensional domain.
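
The headline "15 of 16 participants were faster in the IVE" result is consistent with a simple one-sided sign (binomial) test, sketched below. This is only an illustration of the kind of significance claim quoted above; the thesis itself may use a different statistical test.

```python
# Sign test for 15 of 16 participants being faster in the IVE.
from scipy.stats import binomtest

result = binomtest(15, n=16, p=0.5, alternative="greater")
print(f"one-sided sign test p-value: {result.pvalue:.5f}")   # ~0.00026 < 0.001
```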