Classification Using Hyperdimensional Computing: A Review

Research output: Contribution to journal › Review article › peer-review

107 Scopus citations

Abstract

Hyperdimensional (HD) computing is built upon a unique data type referred to as hypervectors, whose dimension is typically in the range of tens of thousands. Originally proposed for cognitive tasks, HD computing performs learning and classification by measuring similarity among hypervectors. Data transformation is realized by three operations: addition, multiplication, and permutation. The ultra-wide data representation introduces redundancy against noise, and because information is distributed evenly over every bit of a hypervector, HD computing is inherently robust. In addition, the simplicity of these three operations leads to fast learning, high energy efficiency, and acceptable accuracy in learning and classification tasks. This paper introduces the background of HD computing and reviews its data representation, data transformation, and similarity measurement. The near-orthogonality of random hypervectors in high dimensions presents opportunities for flexible computing. To balance the tradeoff between accuracy and efficiency, strategies include, but are not limited to, encoding, retraining, binarization, and hardware acceleration. Evaluations indicate that HD computing shows great potential for problems whose data take the form of letters, signals, and images. In particular, HD computing shows significant promise as a lightweight classifier, replacing conventional machine learning algorithms in the Internet of Things (IoT).
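To make the three operations and the similarity measurement mentioned in the abstract concrete, the following is a minimal sketch using bipolar {-1, +1} hypervectors in Python with NumPy. The dimension, the random seed, the bipolar encoding, and all function names are illustrative assumptions for exposition, not the exact formulation used in the reviewed paper.

# Illustrative sketch of HD operations: binding (multiplication),
# bundling (addition), permutation, and cosine similarity.
import numpy as np

D = 10_000                        # hypervector dimension (tens of thousands)
rng = np.random.default_rng(0)    # assumed seed for reproducibility

def random_hv():
    """Draw a random bipolar hypervector; random pairs are nearly orthogonal."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Multiplication (binding): associates two hypervectors."""
    return a * b

def bundle(*hvs):
    """Addition (bundling): superposes hypervectors, then re-binarizes."""
    s = np.sum(hvs, axis=0)
    return np.where(s >= 0, 1, -1)

def permute(a, shift=1):
    """Permutation: encodes order or position via a circular shift."""
    return np.roll(a, shift)

def similarity(a, b):
    """Cosine similarity used to compare a query against class hypervectors."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Example: random hypervectors are nearly orthogonal, while a bundled
# class hypervector stays similar to each of its members.
x, y, z = random_hv(), random_hv(), random_hv()
class_hv = bundle(x, y, z)
print(round(similarity(x, y), 3))         # close to 0.0 (unrelated)
print(round(similarity(x, class_hv), 3))  # clearly positive (member of bundle)

Under these assumptions, classification reduces to encoding a query into a hypervector and assigning it to the class hypervector with the highest similarity, which is what makes the approach attractive as a lightweight classifier.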

Original language: English (US)
Article number: 9107175
Pages (from-to): 30-47
Number of pages: 18
Journal: IEEE Circuits and Systems Magazine
Volume: 20
Issue number: 2
DOIs
State: Published - Apr 1 2020

Bibliographical note

Publisher Copyright:
© 2001-2012 IEEE.

