An eye feature detector based on convolutional neural network

Fok Hing Chi Tivive*, Abdesselam Bouzerdoum

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

18 Citations (Scopus)

Abstract

One of the main problems in developing an eye detection and tracking system is building a robust eye classifier that can detect true eye patterns in complex scenes. This classification task is very challenging, as the eye can appear in different locations with varying orientations and scales. Furthermore, the eye pattern varies intrinsically between ethnic groups, and with the age and gender of a person. To cope better with these variations, we propose to use a bio-inspired convolutional neural network, based on the mechanism of shunting inhibition, for the detection of eye patterns in unconstrained environments. A learning algorithm is developed for the proposed neural network. Experimental results show that such a network has built-in invariant knowledge and the discriminatory power to classify input regions into eye and non-eye patterns. A classification rate of 99% is achieved by a three-layer network with an input size of 32 × 32 pixels.
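The abstract only names the mechanism; the shunting inhibitory neuron used in the authors' related work is commonly formulated as Z = g(W * X + b) / (a + f(C * X + d)), where * is 2-D convolution, W and C are excitatory and inhibitory kernels, a is a passive decay term, and f, g are activation functions. The sketch below illustrates one such layer under that common formulation; the class name, kernel size, and activation choices are assumptions for illustration, not details taken from the paper.

import torch
import torch.nn as nn

class ShuntingInhibitoryConv2d(nn.Module):
    """One shunting inhibitory convolutional layer (illustrative sketch).

    Each output map is Z = g(W * X + b) / (a + f(C * X + d)), i.e. an
    excitatory convolution divisively inhibited by a second convolution
    over the same receptive field. The exact form in the paper may differ.
    """

    def __init__(self, in_channels, out_channels, kernel_size=5):
        super().__init__()
        pad = kernel_size // 2  # preserve spatial size
        # Excitatory and inhibitory convolution paths (biases play the roles of b and d).
        self.excitatory = nn.Conv2d(in_channels, out_channels, kernel_size, padding=pad)
        self.inhibitory = nn.Conv2d(in_channels, out_channels, kernel_size, padding=pad)
        # Learnable passive decay a, one per output channel.
        self.decay = nn.Parameter(torch.ones(out_channels))

    def forward(self, x):
        num = torch.tanh(self.excitatory(x))                # g(W * X + b)
        den = torch.sigmoid(self.inhibitory(x))             # f(C * X + d), in (0, 1)
        a = torch.abs(self.decay).view(1, -1, 1, 1) + 1e-3  # keep a + f(...) > 0
        return num / (a + den)                              # divisive (shunting) inhibition

# Example: one 32 x 32 grayscale patch, matching the input size quoted in the abstract.
if __name__ == "__main__":
    layer = ShuntingInhibitoryConv2d(in_channels=1, out_channels=4)
    patch = torch.randn(1, 1, 32, 32)
    print(layer(patch).shape)  # torch.Size([1, 4, 32, 32])

The divisive denominator is what distinguishes this from a standard convolutional layer: inhibition rescales the excitatory response rather than subtracting from it, which is the adaptive, contrast-normalizing behavior the shunting mechanism is meant to provide.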

Original language: English
Title of host publication: Proceedings - 8th International Symposium on Signal Processing and its Applications, ISSPA 2005
Pages: 90-93
Number of pages: 4
Publication status: Published - 2005
Externally published: Yes
Event: 8th International Symposium on Signal Processing and its Applications, ISSPA 2005 - Sydney, Australia
Duration: 28 Aug 2005 – 31 Aug 2005

Publication series

Name: Proceedings - 8th International Symposium on Signal Processing and its Applications, ISSPA 2005
Volume: 1

Conference

Conference: 8th International Symposium on Signal Processing and its Applications, ISSPA 2005
Country/Territory: Australia
City: Sydney
Period: 28/08/05 – 31/08/05

