Table of Links
Abstract and 1 Introduction
2 Related Work
3 Methodology
4 Studying Deep Ocean Ecosystem and 4.1 Deep Ocean Research Goals
4.2 Workflow and Data
4.3 Design Challenges and User Tasks
5 The DeepSee System
5.1 Map View
5.2 Core View
5.3 Interpolation View and 5.4 Implementation
6 Usage Scenarios and 6.1 Scenario: Pre-Cruise Planning
6.2 Scenario: On-the-Fly Decision-Making
7 Evaluation and 7.1 Cruise Deployment
7.2 Expert Interviews
7.3 Limitations
7.4 Lessons Learned
8 Conclusions and Future Work, Acknowledgments, and References
7.4 Lessons Learned
Based on the initial deployment of DeepSee (Sect. 7.1), user feedback (Sect. 7.2), and limitations (Sect. 7.3), we synthesized guiding principles for the design of future visualization systems that aim to support fieldwork-driven research.
Prioritize data integration as a user task. Developing visualizations that support fieldwork-driven research allowed us to solve interesting constraints before and during expeditions, including limited computational resources and reconciling existing data collections with data collected during expeditions. However, even though we prioritized predictive capabilities and uncertainty visualizations in DeepSee to address these constraints, we unexpectedly discovered that not designing for data integration support as a user task limited experts' ability to adaptively update hypotheses and make tactical decisions in the field. It was difficult to track data being added on the fly, as P5 described the challenge they encountered during deployment of translating a mental decision-making process into using DeepSee: “You might need to change your hypothesis and your methodology on the fly. These challenges often happen with the lab advisor in their head making big decisions… it was hard to incorporate DeepSee into the decision-making process, especially when it’s in the lab advisor’s head.” A mixed-initiative approach [22] could alleviate time pressure during decision-making by bringing relevant data to the foreground based on users’ interaction history, bridging the gap in sensemaking between concurrent mental and physical analysis processes [41] to determine and communicate what to sample. Prioritizing data integration capabilities as a user task when designing visualization systems for fieldwork-driven research can ensure end-to-end support for tactical decisions when deploying tools in the field.
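One simple way to realize the mixed-initiative idea above is to surface data layers in proportion to how often a user has recently interacted with them. The sketch below is purely illustrative and not part of DeepSee: the event log schema, layer names, and `rank_layers` function are hypothetical assumptions, shown only to make the "foregrounding by interaction history" concept concrete.

```python
from collections import Counter

def rank_layers(interaction_log, layers):
    """Order data layers so those the user has touched most often come first.

    interaction_log: list of events, each a dict with a 'layer' key
                     (hypothetical schema for illustration).
    layers: the layer names available in the interface.
    """
    counts = Counter(event["layer"] for event in interaction_log)
    # Most-interacted layers first; untouched layers keep their original order.
    return sorted(layers, key=lambda layer: counts.get(layer, 0), reverse=True)

# Example: the user has clicked core-level data twice and the map once,
# so core-level layers would be brought to the foreground first.
log = [{"layer": "core"}, {"layer": "core"}, {"layer": "map"}]
print(rank_layers(log, ["map", "core", "interpolation"]))
# → ['core', 'map', 'interpolation']
```

In a real system this ranking would feed a recommendation or layout policy rather than a print statement, and the interaction log would come from instrumented UI events; the point here is only that a lightweight, transparent heuristic can reduce the time experts spend hunting for the data relevant to an in-the-moment decision.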
Visualize physical data in context of the environment. DeepSee fostered new skills in communicating and conceptualizing complex phenomena using intuitive data visualization techniques. For example, interpreting 3D data in the Interpolation View was made easier by representing the data as it looks in real life, i.e., shaped as a cylinder, or with realistic proportions of height and area. This allowed P2 to think through difficult research questions: “I think about the structure of the world I am exploring… There is a lot of data with a lot of dimensions. When we have discrete data points and samples but the environment is continuous, how do we represent phenomenon? For example, sampling the seafloor, or other planets, creates discrete measurements of continuous phenomenon.” Employing direct manipulation helped P5 understand the data more intuitively: “I like being able to physically click around and associate a map with samples. Before, I had to manually input [markers] with GIS software. DeepSee automatically visualizes samples [as they are added, so] now I can click on a sample and see details on demand.” P2 felt that showing the data as it looks in real life helped illuminate patterns in the data that intuitively showed “holes in sampling that we want to fill in”. The team also felt the visualization techniques in DeepSee could extend to other disciplines, such as terrestrial field work integrating drone imagery or future autonomous sampling missions on other planets. Designing data visualizations to mirror real-life counterparts can improve people’s ability to communicate and understand complex scientific phenomena.
Combine data types in new ways to bridge analysis gaps. For team members such as P4 doing data work, DeepSee made centralizing and aligning data a priority: “DeepSee forced us to consolidate data from the same cruise and different cruises into the same format and in one location.” The desire to overlay multiple data types over different temporal ranges (i.e., region-, core-, and sample-level data in Sect. 4.2) catalyzed interest in answering new research questions that were previously time-consuming and/or difficult. For example, by enabling fluid interaction across multiple levels of data aggregation in the Map View and Core View, DeepSee lets users make data-driven decisions based on complex geochemical and taxonomic data distributed at the region, core, and sample level over time (Sect. 6.1). This capability mirrors feedback from our expert interviews (Sect. 7.2) that a key feature of DeepSee for P3 was enabling “spatial thinking for the middle range of analysis” between the microscopic and global level. Further, combining multiple data types in a single interface enabled deep ocean researchers to maximize the scientific value of limited sampling. For example, P5 described the benefit, in their role as a scientist, of tracking hypotheses and data and their changes over time when filtering by cruise in the Map View: “How do we know we’re not reinventing the wheel? DeepSee has really enabled me to ask whether I’m contributing scientific work that fits in the data we have already collected…” DeepSee shows the value of designing for and around data requirements in improving the scientific return and longevity of visualization tools.
Design interactive visualizations to aid mental modeling. Interactive visualizations as a research tool can help scientists gain insight into their own workflows by seeing their problems through a different lens [9]. For example, the Map View enabled core sample data tables to be plotted “live” for building mental models of spatial ecological processes, rather than only as a “static” output for post-hoc modeling. Because of this, during the cruise deployment, P3 observed colleagues' growing awareness of the potential for using visualization tools in the field: “I see a desire from researchers to use more and different data products live in the field. Before… we didn’t look at mapping data as much live, we didn’t look at sequence data as much on the ship during the dive.” In this way, DeepSee demonstrates potential as a generative tool to think with [21]. Manipulating physical data at different scales while continuously maintaining context with the environment between the Map View and Interpolation View aided researchers in their mental modeling of complex physical processes, such as how ecological processes might spread under the sea floor. As a burgeoning PI, P2 is excited to use DeepSee as “a resource for my students to make more informed critical decisions in this line of work. This tool fills in that gap by giving them a visualization of the field to help foster intuition.” They elaborated on the effects of using data visualizations to test ideas live: “I think using DeepSee influenced the way I teach… [having DeepSee] fundamentally changes the questions we can ask and the information that is available by changing the way we see the data.” DeepSee exemplifies interactive visualizations as an opportunity for scientists to iteratively define and refine questions through “making to know” [21, 48] and for designers to leverage the intuitive nature of visualization to demonstrate the art of possibility.
Authors:
(1) Adam Coscia, Georgia Institute of Technology, Atlanta, Georgia, USA ([email protected]);
(2) Haley M. Sapers, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA ([email protected]);
(3) Noah Deutsch, Harvard University, Cambridge, Massachusetts, USA ([email protected]);
(4) Malika Khurana, The New York Times Company, New York, New York, USA ([email protected]);
(5) John S. Magyar, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA ([email protected]);
(6) Sergio A. Parra, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA ([email protected]);
(7) Daniel R. Utter, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA ([email protected]);
(8) David W. Caress, Monterey Bay Aquarium Research Institute, Moss Landing, California, USA ([email protected]);
(9) Eric J. Martin, Monterey Bay Aquarium Research Institute, Moss Landing, California, USA ([email protected]);
(10) Jennifer B. Paduan, Monterey Bay Aquarium Research Institute, Moss Landing, California, USA ([email protected]);
(11) Maggie Hendrie, ArtCenter College of Design, Pasadena, California, USA ([email protected]);
(12) Santiago Lombeyda, California Institute of Technology, Pasadena, California, USA ([email protected]);
(13) Hillary Mushkin, California Institute of Technology, Pasadena, California, USA ([email protected]);
(14) Alex Endert, Georgia Institute of Technology, Atlanta, Georgia, USA ([email protected]);
(15) Scott Davidoff, Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California, USA ([email protected]);
(16) Victoria J. Orphan, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA ([email protected]).