New Mexico Supercomputing Challenge

Sign Language Recognition on Google Glass

Team: 51

School: Los Alamos High

Area of Science: Computer Vision


Interim Report

Problem Definition:
Modern advances in image processing and analysis offer new avenues for software to interpret and express the world around it. The communication barrier between deaf Americans who use sign language and people who do not sign, for example, could be overcome by applications that identify sign language and translate it into spoken or written English. The primary issue with existing programs is that they are difficult to apply when they run exclusively on immobile hardware.
Problem Solution:
In recent years, new head-mounted hardware, especially Google Glass, has made possible innovative and interesting applications for software that were not previously available. We have been looking into OpenCV and plan to use its machine learning and signal processing libraries; specifically, we will be using Hidden Markov Model and Histogram of Oriented Gradients methods to achieve our goals. We intend to start by isolating people, and specifically their hands, from crowded scenes. We will almost certainly be using Java to interface with the Glass, but will most likely be using C for the computations. There has been some discussion of running the calculations on a substantially more powerful handheld device (such as a smartphone) and using Bluetooth to communicate with the Glass.
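As an illustration of the detection step, the sketch below runs OpenCV's stock HOG pedestrian detector over an image using the library's Java bindings. A hand detector would use the same pipeline with an SVM trained on hand images instead of the default people detector. Package and class names follow the OpenCV 2.4 Java API (other versions differ slightly, e.g. Imgcodecs instead of Highgui), and the image path is a placeholder.

    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.core.MatOfDouble;
    import org.opencv.core.MatOfRect;
    import org.opencv.core.Rect;
    import org.opencv.highgui.Highgui;
    import org.opencv.objdetect.HOGDescriptor;

    public class HogDetectSketch {
        public static void main(String[] args) {
            // The Java classes are thin JNI wrappers, so load the native library first.
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

            // Expects an image path as the first command-line argument.
            Mat scene = Highgui.imread(args[0]);

            // HOG descriptor with OpenCV's stock pedestrian SVM; a hand detector
            // would reuse this pipeline with a hand-trained SVM instead.
            HOGDescriptor hog = new HOGDescriptor();
            hog.setSVMDetector(HOGDescriptor.getDefaultPeopleDetector());

            // Slide the detection window over the image at multiple scales.
            MatOfRect locations = new MatOfRect();
            MatOfDouble weights = new MatOfDouble();
            hog.detectMultiScale(scene, locations, weights);

            // Report each detection as a bounding box.
            for (Rect r : locations.toArray()) {
                System.out.printf("detection at (%d, %d) size %dx%d%n",
                        r.x, r.y, r.width, r.height);
            }
        }
    }

On the Glass itself, the same calls would run against frames grabbed from the camera rather than an image file.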
Progress to Date:
At the start of this project, we were already somewhat familiar with Hidden Markov Models and Histograms of Oriented Gradients, as well as the basic capabilities of OpenCV. We chose to begin by exploring these methods further and by researching whether the methodologies we chose are the best suited to solving our problem.
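To make the HMM side concrete, the following self-contained sketch implements the standard forward algorithm, which scores how likely a sequence of observed hand-shape symbols is under a given model. Every state, symbol, and probability in it is a made-up placeholder, not a trained sign model.

    public class HmmForwardSketch {

        // P(observation sequence | model) via the forward algorithm.
        // pi[i]      : probability of starting in hidden state i
        // trans[i][j]: probability of moving from state i to state j
        // emit[i][k] : probability of state i emitting observation symbol k
        static double forward(double[] pi, double[][] trans, double[][] emit, int[] obs) {
            int n = pi.length;
            double[] alpha = new double[n];

            // Initialization: first observation from each possible start state.
            for (int i = 0; i < n; i++) {
                alpha[i] = pi[i] * emit[i][obs[0]];
            }

            // Recursion: fold in one observation at a time.
            for (int t = 1; t < obs.length; t++) {
                double[] next = new double[n];
                for (int j = 0; j < n; j++) {
                    double sum = 0.0;
                    for (int i = 0; i < n; i++) {
                        sum += alpha[i] * trans[i][j];
                    }
                    next[j] = sum * emit[j][obs[t]];
                }
                alpha = next;
            }

            // Termination: total probability over all ending states.
            double total = 0.0;
            for (double a : alpha) {
                total += a;
            }
            return total;
        }

        public static void main(String[] args) {
            // Two hidden states, two observation symbols; made-up numbers.
            double[] pi = {0.6, 0.4};
            double[][] trans = {{0.7, 0.3}, {0.4, 0.6}};
            double[][] emit = {{0.9, 0.1}, {0.2, 0.8}};
            int[] observed = {0, 1, 0};

            System.out.println("P(observations | model) = "
                    + forward(pi, trans, emit, observed));
        }
    }

In practice, one such model would be trained per sign, and a gesture would be labeled with whichever model gives the observed sequence the highest score.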
We have also built and tested some basic programs on the Google Glass in order to fully test its capabilities and to lay the groundwork for more advanced programs.
Expected Results:
We hope to be able to reliably identify hands in crowded scenes and recognize the gestures being made. Once a gesture is recognized, its meaning will be translated from ASL to English and displayed on the Glass screen.
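At its simplest, the final translation step could be a lookup from the label the recognizer emits to an English phrase for display. The sketch below is purely illustrative; the glosses and phrases in it are invented examples.

    import java.util.HashMap;
    import java.util.Map;

    public class GlossTranslatorSketch {

        // Hypothetical mapping from recognizer output labels (ASL glosses)
        // to English phrases shown on the Glass display.
        private static final Map<String, String> GLOSS_TO_ENGLISH =
                new HashMap<String, String>();
        static {
            GLOSS_TO_ENGLISH.put("HELLO", "Hello");
            GLOSS_TO_ENGLISH.put("THANK-YOU", "Thank you");
            GLOSS_TO_ENGLISH.put("WHERE BATHROOM", "Where is the bathroom?");
        }

        // Fall back to the raw gloss so an unrecognized sign is still shown.
        static String translate(String gloss) {
            String english = GLOSS_TO_ENGLISH.get(gloss);
            return english != null ? english : gloss;
        }

        public static void main(String[] args) {
            System.out.println(translate("THANK-YOU")); // prints "Thank you"
        }
    }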


Team Members:

  Sudeep Dasari
  Colin Redman

Sponsoring Teacher: Adam Drew

