Thesis Proposal: Improving Understanding, Trust, and Control with Intelligibility in Context-Aware Applications

April 22nd, 2011

I will present my thesis proposal in early May 2011, covering my work on providing intelligibility in context-aware applications.

When:   May 2nd, Monday @ 1:30pm
Where:  Gates-Hillman Center 6115

Improving Understanding, Trust, and Control with Intelligibility in Context-Aware Applications

Anind K. Dey (Chair), Carnegie Mellon University, Human-Computer Interaction Institute
Scott E. Hudson, Carnegie Mellon University, Human-Computer Interaction Institute
Aniket Kittur, Carnegie Mellon University, Human-Computer Interaction Institute
Margaret M. Burnett, Oregon State University


To facilitate everyday activities, context-aware applications use sensors to detect what is happening, and use increasingly complex mechanisms (e.g., by using machine learning) to infer the user’s context. For example, a mobile application can recognize that you are in a conversation, and suppress any incoming messages. When the application works well, this implicit sensing and complex inference remain invisible. However, when it behaves inappropriately or unexpectedly, users may not understand its behavior, and this can lead users to mistrust, misuse, or abandon it. To counter this, context-aware applications should be intelligible, capable of generating explanations of their behavior.

My thesis investigates providing intelligibility in context-aware applications, and evaluates its usefulness in improving user understanding, trust, and control. I explored which explanation types users want when using context-aware applications in various circumstances, framing explanations in terms of questions users would ask, such as: Why did it do X? What if I did W? What will it do? An early evaluation found that Why and Why Not explanations can improve understanding and trust. I then developed a toolkit that helps developers implement intelligibility in their context-aware applications by automatically generating explanations. Following that, I conducted a usability study to derive design recommendations for presenting usable intelligibility interfaces in a mobile application. In the remaining work, I will evaluate intelligibility in more realistic settings. First, I will explore how intelligibility can help or harm users of applications with high and low certainty. Finally, I will investigate how intelligibility, by improving user understanding, can help users more effectively control a context-aware application.

January 17th, 2011

I’ve recently launched a website for the new Context Toolkit, adapted from the original one built by my advisor, Anind, years ago. Visit the site to learn more: you can download v2.0 of the toolkit and learn how to use it from the tutorials there. The Intelligibility Toolkit is also now available for download as part of the Context Toolkit, with tutorials for its various components on the website as well.

Workshop on Intelligibility and Control in Pervasive Computing

December 24th, 2010

I am co-organizing a Pervasive 2011 workshop on Intelligibility and Control in Pervasive Computing with Jo Vermeulen and Fahim Kawsar, to be held on June 12. The Call for Papers is out, and more information can be found on the workshop website.

Toolkit to Support Intelligibility in Context-Aware Applications

June 21st, 2010

With a design framework in place from [Lim & Dey 2009], this work makes a technical contribution by facilitating the provision of eight explanation types (Input, Output, What, Why, Why Not, How To, What If, Certainty) generated from decision models commonly used in context-aware applications (rules, decision trees, naïve Bayes, hidden Markov models). The Intelligibility Toolkit extends the Enactor framework [Dey & Newberger] by providing more types of explanations and by supporting machine learning classifiers beyond rules. We validate the toolkit with three demonstration applications showing how the explanations can be generated from various decision models.
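To give a flavor of what such explanations look like, here is a minimal sketch of a rule-based "availability" model that can explain its own decisions in the spirit of the Why and Why Not explanation types. This is an illustrative example only, not the Intelligibility Toolkit's actual API; the rule names, functions, and message formats are all made up for this sketch.

```python
# Illustrative sketch only -- not the Intelligibility Toolkit's actual API.
# A tiny rule-based context model (sensor, required value, output),
# evaluated in order; the first matching rule fires.
RULES = [
    ("in_meeting", True, "unavailable"),
    ("talking", True, "unavailable"),
]
DEFAULT = "available"


def infer(context):
    """Return (output, fired_rule) for a context dict of sensor readings."""
    for sensor, value, output in RULES:
        if context.get(sensor) == value:
            return output, (sensor, value)
    return DEFAULT, None


def explain_why(context):
    """Why explanation: report the rule (or default) behind the output."""
    output, rule = infer(context)
    if rule:
        return f"Output is '{output}' because {rule[0]} = {rule[1]}."
    return f"Output is '{output}' because no rule matched (default)."


def explain_why_not(context, other_output):
    """Why Not explanation: report what would have to hold for other_output."""
    output, rule = infer(context)
    if output == other_output:
        return f"The output already is '{other_output}'."
    conditions = [f"{s} = {v}" for s, v, o in RULES if o == other_output]
    if conditions:
        return (f"Not '{other_output}' because none of these held: "
                + "; ".join(conditions))
    return (f"Not '{other_output}' because the rule '{rule[0]} = {rule[1]}' "
            f"fired, giving '{output}'.")
```

For example, `explain_why_not({}, "unavailable")` would list the sensor conditions (being in a meeting or talking) that would have had to hold for the application to suppress messages. The toolkit generalizes this idea to other explanation types and to learned classifiers such as decision trees and naïve Bayes, where the explanation is derived by tracing the model's decision path.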

[Figure: Intelligibility Toolkit architecture]


Anatomy of an HCI Researcher

January 25th, 2010

A fun poster design I made for a design course at CMU. It abstractly charts my different academic interests from high school to grad school, and how I became interested in HCI. You may browse the process booklet for an explanation.



Captain A* – A*STAR Scholarship Video 2007

January 25th, 2010

I directed and produced the A*STAR Scholarship Video for 2007 with a crew of five fellow scholars. The video was shown as one of the entertainment pieces at the 2007 A*STAR Scholarship Awards Ceremony, held on 27th July 2007, and also serves as a promotional video for the A*STAR Graduate Academy to publicize the A*STAR National Science Scholarships.

Though the project did not receive funding, we managed to acquire professional cameras and software from the Institute of Infocomm Research to produce the work.

As an exploration of DVD video creation, I designed a DVD for the video in the DVD-Video format, augmented with a data directory containing computer-navigable video, graphics files, textual information, and a web interface. I am hosting the web interface for the video online, along with the ISO image of the designed DVD; viewers can mount or burn the image to view it.

Playground Building Video

January 25th, 2010

In an effort to learn some video editing, I cooked up this piece depicting some of us building a playground for our church, Bethel Grove Bible Church, in Ithaca. It uses two pieces of video footage and a very limited pool of photographs, and was assembled using Windows Movie Maker.

Click on the image to play the video, or right click and select “Save Link As” to save the video.