Diagonally weighted and shifted criteria for minor and principal component extraction

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

17 Scopus citations

Abstract

A framework for a class of minor and principal component learning rules is presented. These rules compute multiple eigenvectors rather than only a basis for a multi-dimensional eigenspace. Several MCA/PCA cost functions, weighted or shifted by a diagonal matrix, are optimized subject to orthogonal or symmetric constraints. A number of minor and principal component learning rules for symmetric matrices and matrix pencils, many of which are new, are obtained by exploiting the symmetry of the constrained criteria. These algorithms may be seen as counterparts or generalizations of Oja's and Xu's systems for computing multiple principal component analyzers. Procedures for converting minor component flows into principal component flows are also discussed.
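
The weighted criterion described in the abstract can be illustrated with a small numerical sketch. The Python/NumPy snippet below is an illustration of a diagonally weighted Oja-type flow of the general kind the paper studies, not the authors' exact algorithm; the matrix sizes, step size, and iteration count are assumptions. It ascends the criterion tr(D W^T C W) with a soft orthonormality-preserving correction, so the distinct diagonal weights in D force the columns of W toward individual principal eigenvectors rather than an arbitrary basis of the principal subspace.

    import numpy as np

    rng = np.random.default_rng(0)

    # Symmetric positive semi-definite test matrix.
    n, p = 8, 3
    A = rng.standard_normal((n, n))
    C = A @ A.T / n

    # Diagonal weights with distinct entries: they break the rotational
    # invariance of the plain subspace criterion, so each column of W
    # converges to an individual eigenvector instead of only spanning
    # the principal subspace.
    D = np.diag([3.0, 2.0, 1.0])

    # Orthonormal starting point.
    W = np.linalg.qr(rng.standard_normal((n, p)))[0]

    eta = 0.01  # step size (assumed; small enough for stability here)
    for _ in range(20000):
        # Discretized diagonally weighted Oja-type flow:
        #   dW/dt = C W D - W D W^T C W
        W += eta * (C @ W @ D - W @ D @ W.T @ C @ W)

    # Check alignment with the leading eigenvectors of C.
    evals, evecs = np.linalg.eigh(C)          # ascending eigenvalue order
    top = evecs[:, ::-1][:, :p]               # leading p eigenvectors
    print(np.abs(np.sum(W * top, axis=0)))    # ~1.0 when aligned (up to sign)

For minor components, a standard conversion of the kind the abstract alludes to is to run the same flow on the shifted matrix sigma*I - C, with sigma larger than the largest eigenvalue of C, so that the minor eigenvectors of C become the principal eigenvectors of the shifted matrix.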

Original language: English (US)
Title of host publication: Proceedings of the International Joint Conference on Neural Networks, IJCNN 2005
Pages: 1251-1256
Number of pages: 6
State: Published - 2005
Event: International Joint Conference on Neural Networks, IJCNN 2005 - Montreal, QC, Canada
Duration: Jul 31, 2005 - Aug 4, 2005

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2

Other

Other: International Joint Conference on Neural Networks, IJCNN 2005
Country/Territory: Canada
City: Montreal, QC
Period: 7/31/05 - 8/4/05

Keywords

  • Adaptive learning algorithm
  • Diagonally shifted cost function
  • Extreme eigenvalues
  • Generalized MCA
  • Generalized PCA
  • Minor component analysis
  • Neural networks
  • Power method
  • Principal component analysis
  • Symmetric constraints
