Academic Paper

A mobile application for South African Sign Language (SASL) recognition
Document Type
Conference
Source
AFRICON 2015, Sep. 2015, pp. 1-5
Subject
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Fields, Waves and Electromagnetics
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Training
Assistive technology
Accuracy
Support vector machines
Classification algorithms
Biological neural networks
Gesture recognition
South African Sign Language
Android
Mobile Application
Translation
Glove Based Input
Machine Learning
Neural Networks
SVM
SASL
Language
ISSN
2153-0033
Abstract
Sign language uses manual hand and body gestures, as well as non-manual facial expressions, as a means of communication between the deaf and hearing communities. A communication divide exists between the deaf and hearing communities, to the disadvantage of the deaf. This paper explores the design and implementation of a mobile application for South African Sign Language recognition. The application connects, via Bluetooth, to an instrumented glove developed by the University of Cape Town. The objective is to recognize the manual alphabet and the manual numeric digits that have static gestures (31 signs in total). Two neural networks (one with a log-sigmoid activation function and one with a symmetric Elliott activation function) and a Support Vector Machine (SVM) were compared. The SVM was chosen for implementation primarily because of its high accuracy of 99% and its superior robustness. The mobile application is developed for Android and allows the user to connect to the Bluetooth glove, display and dictate the classification output, and calibrate the connected glove. On a low-end smartphone, the classification time did not exceed 45 ms, memory usage did not exceed 15 MB, and the battery life during typical usage was approximately 11 hours.
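
The abstract outlines the recognition pipeline at a high level: the glove streams sensor readings over Bluetooth, and the Android application classifies each static gesture with an SVM. The Java sketch below is only an illustration of how such a client might be structured, not the authors' implementation: the serial-port-profile (SPP) connection uses the standard Android Bluetooth API, while the frame size, the byte normalization, and the GloveClassifier interface are assumptions made for the example.

import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothSocket;
import java.io.DataInputStream;
import java.io.IOException;
import java.util.UUID;

/** Hypothetical interface standing in for the trained SVM classifier. */
interface GloveClassifier {
    String predict(double[] features);
}

public class GloveReader {
    // Standard Bluetooth Serial Port Profile (SPP) UUID.
    private static final UUID SPP_UUID =
            UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");
    // Assumed frame layout; the paper does not specify the glove's sensor count.
    private static final int SENSOR_COUNT = 13;

    private final GloveClassifier classifier;
    private BluetoothSocket socket;

    public GloveReader(GloveClassifier classifier) {
        this.classifier = classifier;
    }

    /** Connects to an already-paired glove, identified by its Bluetooth name. */
    public void connect(String deviceName) throws IOException {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        for (BluetoothDevice device : adapter.getBondedDevices()) {
            if (deviceName.equals(device.getName())) {
                socket = device.createRfcommSocketToServiceRecord(SPP_UUID);
                socket.connect();
                return;
            }
        }
        throw new IOException("Glove not paired: " + deviceName);
    }

    /** Reads one frame of sensor bytes and returns the predicted static sign. */
    public String readAndClassify() throws IOException {
        DataInputStream in = new DataInputStream(socket.getInputStream());
        double[] frame = new double[SENSOR_COUNT];
        for (int i = 0; i < SENSOR_COUNT; i++) {
            frame[i] = in.readUnsignedByte() / 255.0; // normalize to [0, 1]
        }
        return classifier.predict(frame); // one of the 31 static signs
    }
}

In such a design, the trained SVM model (its support vectors and kernel parameters) would sit behind the classifier interface, and the calibration step described in the abstract would rescale raw sensor values before prediction.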