Internship Subject

2871 - Studying the readability of data visualizations

Professionals and citizens increasingly use data visualizations in their daily decision-making, statistical inference, learning, and communication. Research in data visualization often involves evaluating how effectively visual designs convey information. However, the concept of readability remains poorly defined in this field. We aim to clarify the processes and factors that contribute to making a visualization readable. Our work involves developing tools to measure readability and conducting experimental studies to understand how the characteristics of the viewer, the visual representation, and the task at hand influence readability.

As part of this research, you can choose from multiple possible internship topics. We can further discuss how to adapt each topic depending on your profile and interests.

  1. Implementation-oriented: Running our empirical studies involves setting up a technical environment to display visualizations, show questions, and collect data (answers, time spent on a question). When comparing multiple visualizations, other constraints also come into play, such as randomizing experimental conditions or the order in which stimuli appear (a minimal sketch of such randomization and response logging is given after this list). This internship topic focuses on understanding existing frameworks for creating online user studies, assessing how well they support our study goals, and creating a technical framework that facilitates the implementation of empirical studies. Your framework will be released under an open-source license, either developed from scratch or by extending an existing framework and adapting it to our needs. You will deploy it to run an online study testing how presentation format affects how people rate the readability of data visualizations. (See related works 1 and 2 further below)
  2. Implementation-oriented: To run empirical studies examining what makes a visualization readable, researchers need different stimuli (for example, a baseline visualization and modified versions of it to assess the effect of certain parameters). In this internship, you will create a corpus of such visualizations. This corpus will be used in our lab and shared with the international research community as a resource for new studies. (See related works 1 and 3 further below)
  3. Interdisciplinarity-oriented: Readability has long been studied for texts, and this body of research can provide valuable knowledge for assessing readability in data visualization. In this internship, you will compile a list of methods used in educational and psychology research to evaluate text readability. You will then assess how such methods can be adapted to the context of data visualizations and whether visualization researchers have already used similar evaluation approaches. You will derive and test a novel study design, which will become part of our readability evaluation framework. (See related works 1 and 4 further below)
  4. Climate change visualizations: Climate change visualizations are essential for communicating the effects of human behavior on our planet and the consequences of global warming. Yet, climate change visualizations are often difficult to read and understand. As part of this internship, you will investigate what exactly makes climate change visualizations difficult to read. Is it the data encoding? Is it the type of text (jargon) used? Is it the background knowledge people have on the topic? You will conduct qualitative work (such as interviews) to explore possible reasons why climate visualizations are hard for people to read. The insights you gain can then lead to a follow-up project in which you take existing visualizations (such as those from the IPCC), redesign them for better readability, and run empirical studies to test the redesigned visualizations against the original ones. (See related works 1 and 5 further below)
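
To give a concrete idea of the infrastructure work in topic 1, here is a minimal sketch, in Python, of per-participant condition counterbalancing and response logging. The condition names, stimuli, rating question, and output fields are hypothetical placeholders chosen for illustration, not our actual study design or framework.

```python
import csv
import random
import time

# Hypothetical experimental conditions and stimuli (placeholders, not the real study design).
CONDITIONS = ["static_image", "interactive", "animated"]
STIMULI = ["bar_chart", "line_chart", "scatterplot"]


def make_trial_order(participant_id: int) -> list[tuple[str, str]]:
    """Build a counterbalanced, randomized trial order for one participant.

    Condition blocks are rotated by participant id (a simple Latin-square-style
    counterbalancing), and stimuli are shuffled within each block using a
    participant-specific seed so each participant's order is reproducible.
    """
    shift = participant_id % len(CONDITIONS)
    block_order = CONDITIONS[shift:] + CONDITIONS[:shift]
    rng = random.Random(participant_id)
    trials: list[tuple[str, str]] = []
    for condition in block_order:
        stimuli = STIMULI[:]
        rng.shuffle(stimuli)
        trials.extend((condition, stimulus) for stimulus in stimuli)
    return trials


def run_session(participant_id: int, out_path: str = "responses.csv") -> None:
    """Present each trial and record the answer and the time spent on it."""
    with open(out_path, "a", newline="") as f:
        writer = csv.writer(f)
        for condition, stimulus in make_trial_order(participant_id):
            start = time.monotonic()
            # In a real online study the stimulus would be rendered in the browser;
            # here we only prompt on the command line to keep the sketch self-contained.
            answer = input(f"[{condition}] How readable is the {stimulus} (1-7)? ")
            elapsed = time.monotonic() - start
            writer.writerow([participant_id, condition, stimulus, answer, round(elapsed, 3)])


if __name__ == "__main__":
    run_session(participant_id=1)
```

Seeding the shuffle with the participant id keeps each participant's trial order reproducible, which simplifies debugging and later analysis; a real framework would additionally handle stimulus rendering, consent, and data storage on a server.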

Most internships will also require a literature review. Other internship topic ideas can be discussed. All topics can lead to a future PhD position (depending on the student’s performance).