Faculty Lightning Talks

Faculty lightning talks will be presented Thursday, March 1, 2018, from 9:00-10:00am in Newman Library 207A.

The “System” in “Systems of Truth”

Prof. Deborah Tatar, Dept. of Computer Science

Truth is more than a series of facts. Facts are part of truth, but so too are the kinds of intellectual and value structures in which they are embedded. Trust in many of our most important systems of truth relies on an assumption of good will on the part of participants, including obedience to Grice’s overarching maxim of cooperation in language (“Make your contribution such as it is required, at the stage at which it occurs, by the accepted purpose or direction of the talk exchange in which you are engaged.”). Participants who undermine that trust often violate the particular maxim of Quality with its sub-maxim, “do not lie.” To uphold the Cooperative Principle when Quality is in doubt, we must implement socio-technical systems that allow enhanced explorations, in speech and in writing, of how Grice’s other maxims (Quantity, Relation, and Manner) can play out in the system.

Systematic Bias in Volunteered Geographic Information

Dr. Jacob Thebault-Spieker, Dept. of Computer Science

Our human and algorithmic understandings of the world increasingly depend on large, open information repositories like Wikipedia and OpenStreetMap. The decentralized nature of these platforms leaves room for the vagaries of human choice to influence which information is created, and subsequently our understanding of the world. Some of my work shows that contributors’ individual decisions lead to unfortunate, systematic patterns in where this information is produced. In this brief overview of some of my thesis work, I will discuss the ways in which content production bias can impact both our human and algorithmic understandings of truth.

Teaching Orwell’s 1984 in 2017

Prof. Tom Ewing, Dept. of History

Teaching Orwell’s 1984 in 2017 explores the instructor’s experience of assigning the classic work of dystopian government control in a year that began with falsification of attendance counts at a presidential inauguration. Orwell’s 1984 was assigned as required reading in spring 2017 and fall 2017 in a new course, Introduction to Data in Social Context, which asks why counting matters in distinct historical and social contexts. As a fictional exploration of the tension between memory, truth, and identity, on the one hand, and censorship, torture, and violence, on the other, this novel offered students a unique perspective from which to think about power dynamics in a society where every action, statement, and even emotion is quantified, tracked, and recorded. The novel’s intense exploration of a mathematical formula (must 2+2=4, or can 2+2=5?) makes it an ideal platform for exploring how truth systems are designed, contested, and enforced.

Some Data Points of Undergraduate Understanding of Systems of Truth

Prof. Steve Harrison, Dept. of Computer Science and School of Visual Arts

I teach the Creative Computing Capstone Studio. The students have created three short projects so far this semester, each around the theme of “systems of truth”. The project briefs and the theme are deliberately underspecified. Observing the results, a few patterns have emerged in how “systems of truth” has been interpreted. These students are about to graduate, and these patterns suggest something about their relationship to systems and to truth.

Charity Begins at Home

Dr. Andrea Kavanaugh, Center for HCI and Dept. of Computer Science

The phrase ‘charity begins at home’ is often misunderstood, but its original meaning is that if a person is not charitable with the most intimate members of their social network, they are not likely to be charitable with others beyond it. People rely on their social networks not only for emotional support, but also for resources, such as aid, including information. People share information with members of their social network, including acquaintances in their geographic communities and local voluntary associations, such as a place of worship, school, and work. As information dissemination and interaction move increasingly online and over social media, a willingness to be helpful to others with reliable information expresses itself increasingly online as well. Even if people are sharing misinformation in their social networks, it is more easily identifiable among members, since they know each other in offline contexts. ‘Discussion networks’ have important properties that mitigate misinformation, including measures of heterogeneity and differing levels of knowledge on topics. A person has a sense of who in their social network is more knowledgeable than they are, and whose information is more trustworthy, in the sense of being more accurate and factual, consistently over time. This localized knowledge, in one’s social network and in one’s geographic community, is an important factor in building and sustaining trust in information. Trustworthy information is more likely to be recognizable among users at this localized level, and misinformation more likely to be identified and corrected. This claim supports an approach to information that is aggregated at the local level and exchanged among participants in a geographic area, such as a town or a city neighborhood, who know each other and the local circumstances. The local geographic level may represent the first line of defense in disseminating trustworthy information and mitigating uncharitable exchanges and untrustworthy behavior.

Exploring the influence of fake news tweets on news consumption

Prof. Dan Tamul, Dept. of Communication

This study examines how Donald Trump’s tweets about fake news influence readers’ willingness to consume news, the amount of news they consume, and the effects of the story on their perceptions of the issue. We conducted this study within a narrative persuasion context and explored how the fake news tweets interact with narrative engagement and attitudes toward story characters.

Conspiracies Online: Lessons from studying user discussions in a conspiratorial community

Prof. Tanu Mitra, Dept. of Computer Science

Social media systems have expanded the bounds of free speech and information reach. However, those very opportunities have also created new challenges pertaining to the veracity and credibility of information. In this talk, I will report ongoing work investigating one such challenge – the rapid dissemination of unsubstantiated rumors and conspiracy theories. Our investigation is guided by the following questions: How and when do these theories emerge? Who participates in them? What drives people to engage in conspiratorial discussions? So far, we have taken a large-scale quantitative data analysis approach to investigate these questions. Insights from our initial analysis can inform how new conspiracy theories and destructive rumors emerge, and what we can do to counter them.

Understanding Truth in Existing Systems

Prof. Scott McCrickard, Dept. of Computer Science

The notion of “systems of truth” is evocative, and the Systems of Truth workshop description, fellow bios, and other information serve to further stimulate thoughts on this topic. The introductory human-computer interaction class at Virginia Tech spends much of its early weeks encouraging computer science majors to examine and analyze existing technology use in various settings, seeking to understand how technology can have both positive and negative effects on workplaces, communication, recreation, and outdoor activities. This talk highlights some of the reactions to the workshop topic area from students in the class.