Image compression using a stochastic competitive learning algorithm (SCoLA)

Abdesselam Bouzerdoum*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We introduce a new stochastic competitive learning algorithm (SCoLA) and apply it to vector quantization for image compression. In competitive learning, training proceeds by presenting an input vector simultaneously to all competing neurons; each neuron compares the input vector with its own weight vector, and one neuron is declared the winner according to a deterministic distortion measure. Here, a stochastic criterion is used instead to select the winning neuron, whose weights are then updated to become more like the input vector. The performance of the new algorithm is compared with that of frequency-sensitive competitive learning (FSCL); SCoLA was found to achieve higher peak signal-to-noise ratios (PSNR) than FSCL.
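
The abstract describes the competitive-learning update only in outline, so the following minimal Python sketch illustrates the general idea for vector-quantization codebook training. The softmax-style selection probability over negative squared distances, and all function and parameter names, are assumptions made for illustration; the paper's actual stochastic criterion and learning schedule are not given here.

import numpy as np

def train_stochastic_vq_codebook(training_vectors, num_codewords, epochs=10,
                                 learning_rate=0.05, temperature=1.0, seed=0):
    """Train a VQ codebook with stochastic winner selection.

    Assumption: the winner is drawn at random with probability inversely
    related to its distortion (softmax over negative squared distances),
    a hypothetical stand-in for SCoLA's stochastic criterion.
    training_vectors is expected to be a 2D NumPy array (N vectors x dimension).
    """
    rng = np.random.default_rng(seed)
    # Initialise codewords from randomly chosen training vectors.
    init = rng.choice(len(training_vectors), num_codewords, replace=False)
    codebook = training_vectors[init].astype(float)

    for _ in range(epochs):
        for x in training_vectors:
            # Each competing neuron measures its distortion to the input
            # (squared Euclidean distance between weight vector and input).
            dist = np.sum((codebook - x) ** 2, axis=1)
            # Stochastic winner selection: lower distortion -> higher probability.
            logits = -dist / temperature
            probs = np.exp(logits - logits.max())
            probs /= probs.sum()
            winner = rng.choice(num_codewords, p=probs)
            # Move the winner's weight vector towards the input vector.
            codebook[winner] += learning_rate * (x - codebook[winner])
    return codebook

def encode_blocks(image_blocks, codebook):
    """Map each image block to the index of its nearest codeword."""
    d = ((image_blocks[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

In use, an image would be split into small blocks (e.g. 4x4 pixels flattened to 16-dimensional vectors), the codebook trained on those blocks, and each block replaced by its codeword index; reconstruction quality can then be reported as PSNR, as in the paper's comparison with FSCL.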

Original language: English
Title of host publication: 6th International Symposium on Signal Processing and Its Applications, ISSPA 2001 - Proceedings; 6 Tutorials in Communications, Image Processing and Signal Analysis
Publisher: IEEE Computer Society
Pages: 541-544
Number of pages: 4
ISBN (Print): 0780367030, 9780780367036
DOIs
Publication status: Published - 2001
Externally published: Yes
Event: 6th International Symposium on Signal Processing and Its Applications, ISSPA 2001 - Kuala Lumpur, Malaysia
Duration: 13 Aug 2001 - 16 Aug 2001

Publication series

Name: 6th International Symposium on Signal Processing and Its Applications, ISSPA 2001 - Proceedings; 6 Tutorials in Communications, Image Processing and Signal Analysis
Volume: 2

Conference

Conference: 6th International Symposium on Signal Processing and Its Applications, ISSPA 2001
Country/Territory: Malaysia
City: Kuala Lumpur
Period: 13/08/01 - 16/08/01
