Applied Informatics Group

Evaluation of Human-System Interaction

Research Questions

To enable intelligent systems (e.g. smart homes, social robots, and virtual agents) to engage in interaction with human users, we need to understand human multimodal communication and the sequential structure of authentic interactions. Inspired by human-human interaction (HHI), we develop (adapted) interactional models for robots, which we implement in autonomous systems. In HRI studies, we evaluate the users' perception and appraisal of the system as well as its technical performance. Based on our findings, we further investigate the emerging effects in long-term evaluations. A grand challenge for intelligent systems that assist and interact with humans on a daily basis is to engage users in repeated interaction. Several factors can enrich such interactions and establish long-term companionship. In our projects, we explore the different system requirements for long-term interaction and evaluate the necessary interactive capabilities in user studies.

Among others, we investigate the following factors influencing Human-System Interaction:

  • Social cues (distance, orientation, attention, engagement)
  • Episodic memory (training progression, game statistics, user preferences)
  • Contextual knowledge (game rules, social norms)
  • Type of feedback (motivational, instructional, quantitative, qualitative)
  • Embodiment (e.g. virtual agent vs. ambient surroundings vs. anthropomorphic robot)
  • Social roles (e.g. trainer vs. companion)
  • Novelty

In general, our approach is to design social scenarios in which naive users can explore our systems and platforms interactively. Using a user-centered study design, we test the influence of the factors listed above by manipulating the experimental conditions. From our measurements, we draw conclusions about the factors that shape the user experience with our systems.

The measurements we use across our experiments include human cues such as gaze, (head) pose, orientation, voice pitch, and facial expression. Furthermore, we use questionnaires to learn about the users' subjective impressions of, e.g., interaction enjoyment, perceived social intelligence, intuitiveness, and human-likeness. Finally, we also use objective measurements such as task success, heart rate, or exercise repetitions.

If you are interested in our studies, please have a look at the following selected publications: [1], [2], [3].

Contact

Britta Wrede

Related Projects

Former Projects

Related Publications

  1. 2012 | Conference Paper | PUB-ID: 2617032
    "Can you answer questions, Flobi?": Interactionally defining a robot’s competence as a fitness instructor
    Süssenbach L, Pitsch K, Berger I, Riether N, Kummert F (2012)
    In: Proceedings of the 21st IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN 2012).
    PUB | PDF
     
  2. 2011 | Journal Article | PUB-ID: 2481769
    How Can I Help? - Spatial Attention Strategies for a Receptionist Robot
    Holthaus P, Pitsch K, Wachsmuth S (2011)
    International Journal of Social Robotics 3(4): 383-393.
    PUB | PDF | DOI | WoS
     
  3. 2013 | Conference Paper | PUB-ID: 2605000
    A conversational virtual human as autonomous assistant for elderly and cognitively impaired users? Social acceptability and design considerations
    Kramer M, Yaghoubzadeh R, Kopp S, Pitsch K (2013)
    In: INFORMATIK 2013. Informatik angepasst an Mensch, Organisation und Umwelt, 16.–20. September 2013 Koblenz, Germany. Horbach M (Ed); Lecture Notes in Informatics (LNI). Proceedings, 220. Bonn: Gesellschaft für Informatik: 1105-1119.
    PUB | PDF

Recent Best Paper/Poster Awards

Goal Babbling of Acoustic-Articulatory Models with Adaptive Exploration Noise
Philippsen A, Reinhart F, Wrede B (2016)
International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob) 
PUB | PDF

 

Are you talking to me? Improving the robustness of dialogue systems in a multi party HRI scenario by incorporating gaze direction and lip movement of attendees
Richter V, Carlmeyer B, Lier F, Meyer zu Borgsen S, Kummert F, Wachsmuth S, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)

 

"Look at Me!": Self-Interruptions as Attention Booster?
Carlmeyer B, Schlangen D, Wrede B (2016)
International Conference on Human-Agent Interaction (HAI)
PUB | DOI

 

For members