New Mexico Supercomputing Challenge

Eye Gaze Behavior in Autonomous Sociable Robots

Team: 127

School: School of Dreams Academy

Area of Science: Computer Science

Interim Report

Problem definition:
The basis of this project is to develop a robot that interacts with people using natural face-to-face communication through both verbal and nonverbal cues. One of the most important mechanisms for regulating face-to-face interaction is eye gaze behavior: an individual's gaze pattern when looking at and communicating with another person. For this work, we consider two kinds of eye gaze: looking at and looking away from the person with whom the robot is interacting. Looking-at behavior includes observation gaze, when the listening partner looks at their interaction counterpart without reciprocated eye gaze, and mutual gaze, when both partners look at each other and maintain eye contact. Looking-away behavior refers to averted gaze, when one partner breaks eye contact, typically to think of what to say next or to process information.
The goal of this project is to develop a sociable robot eye gaze system that, depending on the social context, recognizes the eye gaze behavior of its human interaction partner and then makes one of two decisions: to look at its interaction partner, or to look away. The system must analyze the surrounding conditions and balance social acceptability with functionality in face-to-face human-robot interaction (HRI).

Problem solution:
The sociable robot eye gaze system must mimic the social behaviors of typical human-human interaction while also maintaining a certain functionality. We must start by observing people using natural face-to-face communication through both verbal and nonverbal cues, then model the resulting data set so that a robot can interact in the same way. The model is responsible for recognizing the factors that affect eye gaze behavior, such as the distance between the partners, the genders of the people communicating, and their interest in the topic discussed. Based on these conditions, the model then selects the more appropriate eye gaze behavior: looking at or looking away. Afterwards, when the model is implemented on the robot platform, it will be put into a social setting to test its social acceptability.
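The decision step described above can be sketched as a small rule-based function. This is only an illustrative assumption of how the model might map contextual factors to a gaze behavior; the feature names and thresholds below are hypothetical, not values derived from the study data.

```python
# Hypothetical sketch of the gaze-decision step: given contextual features
# of the interaction, pick one of the two gaze behaviors. All thresholds
# are illustrative assumptions, not measured values.

def choose_gaze(partner_is_looking: bool, distance_m: float, interest: float) -> str:
    """Return 'look_at' or 'look_away' for the robot's next gaze action."""
    # Equilibrium-theory intuition: at close distances, sustained mutual
    # gaze becomes uncomfortable, so the robot averts its gaze more often.
    if partner_is_looking and distance_m < 0.8:
        return "look_away"
    # High interest in the topic favors maintaining or initiating eye contact.
    if interest > 0.5:
        return "look_at"
    return "look_away"

print(choose_gaze(True, 0.5, 0.9))   # close-range mutual gaze -> look_away
print(choose_gaze(False, 1.5, 0.8))  # engaged listener at distance -> look_at
```

A data-driven model would learn these decision boundaries from the annotated human-human recordings rather than hand-coding them.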

Progress to date:
Currently, preliminary data has been collected from human-human interactions and is being annotated. Typical social cues, such as distance, hand gestures, body language, and eye positions, are all being taken into account. I will first look at the percentage of time each partner spends looking at and looking away, and construct a simple model with that data. With this model, I can run some preliminary tests and determine which modeling technique would be best to use. So far, I have narrowed it down to Latent-Dynamic Conditional Random Fields (LDCRF) or a Hidden Markov Model (HMM) with cross-validation. This model will give a baseline to start with, and I can later program in more complex behaviors for the robot to take into account.
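The baseline described above can be sketched as follows: from an annotated gaze sequence (one label per time step), compute the fraction of time spent in each behavior and the transition probabilities between them. The sample sequence here is made up for illustration; real annotations would come from the recorded human-human interactions, and an HMM or LDCRF would add hidden states or contextual features on top of this simple Markov-chain view.

```python
# Minimal baseline sketch: gaze-state proportions and transition
# probabilities from an annotated sequence. The sequence is illustrative.
from collections import Counter

sequence = ["at", "at", "away", "at", "at", "at", "away", "away", "at"]

# Fraction of time spent in each gaze state.
counts = Counter(sequence)
proportions = {state: n / len(sequence) for state, n in counts.items()}

# Transition probabilities between consecutive states: the core of a
# simple first-order Markov-chain baseline.
pairs = Counter(zip(sequence, sequence[1:]))
transitions = {
    (a, b): n / sum(m for (x, _), m in pairs.items() if x == a)
    for (a, b), n in pairs.items()
}

print(proportions)  # e.g. {'at': 0.667, 'away': 0.333} for this sequence
print(transitions)
```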

Expected results:
When developing a robot that is both socially acceptable and functional, it is expected that the human interaction partner will: prefer a socially acceptable robot over a non-socially acceptable one, prefer a functional robot over a nonfunctional one, and prefer a socially acceptable robot over a merely functional one. Once the model has been refined into a socially acceptable platform, it can be evaluated against these predicted outcomes using behavioral, subjective, objective, and qualitative measures. The resulting sociable robotic eye gaze behavior will help the human-robot interaction field by making robots more acceptable and human-like. The model could also be expanded to take more complex conditions into account, producing a more intelligent predictive model.

Works Cited:
Aiello, J. R. (1972). A test of equilibrium theory: Visual interaction in relation to orientation, distance and sex of interactants. Psychonomic Science, 27, 335-336.

Aiello, J. R. (1977a). Visual interaction at extended distances. Personality and Social Psychology Bulletin, 3, 83-86.

Argyle, M., & Cook, M. (1976). Gaze and Mutual Gaze. Cambridge, UK: Cambridge University Press.

Argyle, M., & Dean, J. (1965). Eye-contact, distance and affiliation. Sociometry, 28, 289-304.

Exline, R. V., & Messick, D. (1967). The effects of dependency and social reinforcement upon visual behavior during an interview. British Journal of Social and Clinical Psychology, 6, 256-266.

Kendon, A. (1967). Some functions of gaze-direction in social interaction. Acta Psychologica, 26, 22-63.

Team Members:

  Chloe Grubb

Sponsoring Teacher: Talysa Ogas
