Kate Isaacs
@kisaacs.bsky.social
Assoc. Prof. Kahlert School of Computing & SCI Institute, The University of Utah. Data Visualization. HPC. She/Her. https://kisaacs.github.io
Some neat data visualizations in the posters too. This ACM SRC poster describes iSeeMore from Virginia Tech, a kinetic sculpture for visualizing parallel algorithms, in this case LLM computations. Hope to see it in person some day!
November 20, 2024 at 3:21 PM
Phuong Cao at NCSA visualizes 15 seconds of TCP connections, drawn with Gephi (~29k nodes). The online version shows different cyber attacks: sc24.conference-program.com/presentation...

So many questions to ask of this data! Graph drawings get messy quick, but curiosity always has us looking!
November 20, 2024 at 3:02 PM
The Art of HPC at #SC24 houses all kinds of art and data visualizations this year. Come see it in B301!

"What's going on in there?" by @cscullyallison.bsky.social , Kevin Menear, & Dmitry Duplyakin delightfully displays the kinds of science being computed on Kestrel.
November 20, 2024 at 2:37 PM
Yesterday at ProTools, Shilpika from Argonne presented their system for finding patterns in regularly sampled data about their machines (too much to store without this kind of analysis along the way), including machine visualizations to examine in Jupyter.

What more HPC vis will today bring? #SC24
November 19, 2024 at 2:24 PM
Halfway to my full set of Platonic solids by way of the Utah booth at #SC24, thanks to the cool folks at CHPC.
November 19, 2024 at 3:45 AM
My collaborators were tempted to skip design process steps, leading to some interventions. Yifan's sincere championing of vis, tools, & design kept these fruitful and pleasant. :)

Furthermore, designing domain-side gave us a fluid loop from data collection to vis implementation!
October 17, 2024 at 2:43 AM
Too many computing problems to vis, not enough time!

Thursday #ieeevis, fourth session, *Computer Architecture* Prof. Yifan Sun presents a domain-expert-led design study!

Three chip experts, with some help from me, designed an interactive vis to analyze simulations of new architecture ideas:
October 17, 2024 at 2:35 AM
While the example above is small, in computing and manufacturing contexts, Gantt chart data might have billions of events.

We collect supported tasks, how they are implemented in vis designs, and what queries need to be optimized to support them. This heat map shows prevalence in the literature:
October 17, 2024 at 2:18 AM
Thursday #ieeevis, first session, Sayef Azad Sakin is presenting "A Literature-based Visualization Task Taxonomy for Gantt Charts," in the "Short Papers: Perception & Representation" session!

We collect visualization tasks implemented by Gantt charts:
October 17, 2024 at 2:15 AM
We observed that even strong coders found vis better suited for some tasks and that blending interactive vis and scripting could suggest alternative designs for multiple coordinated views. However, even in a domain with strong programming skills, state tracking in the notebook could be challenging.
October 16, 2024 at 4:26 AM
Our context was an interactive call tree visualization used in tandem with a domain-specific data science library in Jupyter notebooks. We considered which tasks were better handled by coding or by interactive vis in order to prioritize design elements and features:
October 16, 2024 at 4:24 AM
Wednesday #ieeevis, fourth session, Connor Scully-Allison is presenting “Design Concerns for Integrated Scripting and Interactive Visualization in Notebook Environments,” in the Scripts, Notebooks, and Provenance session!

We consider whether to support tasks via scripting or visualization:
October 16, 2024 at 4:20 AM
Through our studies with this framework, we found the gestures conveyed many different meanings, sometimes multiple meanings at once, and that different gestures could convey the same meaning.
October 16, 2024 at 4:16 AM
Verbal expressions alone may be ambiguous when revisiting a transcript. Our framework combines audio transcripts with recorded cursor gestures so utterances such as “this cluster” or “look over here” retain meaning both in the transcript and in summaries:
October 16, 2024 at 4:12 AM
Look here! Wednesday #ieeevis, third session, Chang Han is presenting “A Deixis-Centered Approach for Documenting Remote Synchronous Communication around Data Visualizations,” in the Collaboration & Communication session!

We capture cursor gestures made in online collaborative meetings around data:
October 16, 2024 at 4:09 AM
Our layout was used by collaborators to create and debug their network models of computation as part of a linked view system. They used the system both to identify network specification bugs and to understand their own evolving model. You can see the demo here: ml4ai.github.io/moviz-client...
October 16, 2024 at 4:06 AM
Wednesday #ieeevis, first session, Chang Han is presenting "An Overview + Detail Layout for Visualizing Compound Graphs," in the "Short Papers: Graph, Hierarchy and Multidimensional" session!

Our layout arranges structural representations of multiple levels of a hierarchical network at once:
October 16, 2024 at 4:03 AM