Designing an Eye-Tracking Algorithm for a Partner-Assisted Eye-Scanning Keyboard for Physically Challenged People
The proposed research builds an eye-movement-driven keyboard by designing an algorithm for eye-movement detection based on the partner-assisted scanning technique. The study covers all stages of gesture recognition, from data acquisition through eye detection and tracking to classification. Since many techniques exist for implementing these stages, the main objective of this work is to select a simple, inexpensive technique that yields the best possible results with a high level of accuracy. The results are then compared with recent related work to demonstrate the efficiency of the proposed algorithm. The system begins with a calibration phase, in which the user’s face is detected by a trained support vector machine. Features are then extracted, after which the eyes are tracked using skin-colour segmentation, along with several supporting operations. The overall system is a keyboard operated by eye movement through the partner-assisted scanning technique. A good level of accuracy was achieved, and several alternative methods were implemented and compared. This keyboard contributes a novel combination of techniques for eye detection and tracking to the research field, and helps bridge the gap between physical paralysis and leading a normal life. The system can serve as a baseline for comparison with other proposed eye-detection algorithms, and as evidence for the efficiency of combining several different techniques into one algorithm. It also strongly supports the effectiveness of machine-learning and appearance-based methods.
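As a rough illustration of the skin-colour segmentation step used in the tracking stage, the sketch below thresholds pixels in the YCrCb colour space. This is a minimal NumPy sketch under assumed parameters: the Cr/Cb ranges are commonly cited defaults, not the values used in this work, and the conversion follows the ITU-R BT.601 definition.

```python
import numpy as np

def rgb_to_ycrcb(img):
    """Convert an RGB uint8 image (H, W, 3) to YCrCb (ITU-R BT.601)."""
    img = img.astype(np.float32)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) * 0.713 + 128.0
    cb = (b - y) * 0.564 + 128.0
    return np.stack([y, cr, cb], axis=-1)

def skin_mask(img, cr_range=(133, 173), cb_range=(77, 127)):
    """Return a boolean mask of likely skin pixels.

    The Cr/Cb ranges are illustrative defaults, assumed here,
    not the thresholds from the proposed algorithm."""
    ycrcb = rgb_to_ycrcb(img)
    cr, cb = ycrcb[..., 1], ycrcb[..., 2]
    return ((cr >= cr_range[0]) & (cr <= cr_range[1]) &
            (cb >= cb_range[0]) & (cb <= cb_range[1]))

# Toy usage: two skin-toned pixels and two saturated non-skin pixels.
img = np.array([[[200, 150, 130], [0, 0, 255]],
                [[210, 160, 140], [0, 255, 0]]], dtype=np.uint8)
mask = skin_mask(img)
print(mask)  # skin-toned pixels True, blue/green pixels False
```

In a full pipeline, such a mask would restrict the search region for the eyes within the detected face, reducing the cost of tracking in subsequent frames.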