Date of Award


Document Type


Degree Name

Bachelor of Science



First Advisor

Dr. Barry Lawson


Abstract

The goal of this work is to implement a real-time system that uses wearable technology to translate American Sign Language (ASL) gestures into audible form. Such a system could facilitate conversations between individuals who do and do not communicate using ASL. Our source of input is the Myo armband, an affordable, commercially available wearable device equipped with on-board accelerometer, gyroscope, and electromyography (EMG) sensors. We investigate the performance of two classification algorithms in this context: linear discriminant analysis and k-Nearest Neighbors (k-NN) with various distance metrics. Using the k-NN classifier with windowed dynamic time warping as the distance metric, our working prototype achieves accuracies between 94% and 98% when classifying up to 20 different ASL gestures.
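
To make the classification step concrete, the sketch below shows one way a k-NN classifier using windowed (Sakoe-Chiba band) dynamic time warping as its distance metric could be structured. This is not the thesis code; the band half-width, the value of k, the 8-channel EMG-like sequences, and the gesture labels are all illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of k-NN classification
# with a windowed DTW distance over multivariate sensor sequences.
import numpy as np


def windowed_dtw(a: np.ndarray, b: np.ndarray, band: int = 10) -> float:
    """DTW distance between two sequences (T x D arrays), restricted to a
    Sakoe-Chiba band of half-width `band` around the diagonal."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - band), min(m, i + band) + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]


def knn_classify(query, templates, labels, k=1, band=10):
    """Label a query gesture by majority vote over its k nearest templates."""
    dists = [windowed_dtw(query, t, band) for t in templates]
    nearest = np.argsort(dists)[:k]
    votes = [labels[i] for i in nearest]
    return max(set(votes), key=votes.count)


# Illustrative usage with synthetic 8-channel EMG-like recordings.
rng = np.random.default_rng(0)
templates = [rng.standard_normal((50, 8)) for _ in range(6)]
labels = ["hello", "hello", "thanks", "thanks", "yes", "yes"]
query = templates[2] + 0.05 * rng.standard_normal((50, 8))
print(knn_classify(query, templates, labels, k=1))  # prints "thanks"
```

Restricting the DTW computation to a band limits how far the alignment path may stray from the diagonal, which both speeds up the distance calculation and prevents pathological warpings between gestures of similar length.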