Multi-class batch-mode active learning for image classification

Ajay J. Joshi, Fatih Porikli, Nikolaos P. Papanikolopoulos

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

21 Scopus citations

Abstract

Accurate image classification is crucial in many robotics and surveillance applications - for example, a vision system on a robot needs to accurately recognize the objects seen by its camera. Object recognition systems typically need a large amount of training data for satisfactory performance. The problem is particularly acute when many object categories are present. In this paper we present a batch-mode active learning framework for multi-class image classification systems. In active learning, the learner actively selects images for interactive labeling instead of passively accepting training data. Our framework addresses two important issues: i) it handles redundancy between different images, which is crucial when batch-mode selection is performed; and ii) we pose batch selection as a submodular function optimization problem, which makes an inherently intractable problem efficient to solve while providing approximation guarantees. We show results on image classification data in which our approach substantially reduces the amount of training required over the baseline.
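The abstract's point ii) refers to the classical result that monotone submodular objectives can be maximized greedily with a (1 - 1/e) approximation guarantee (Nemhauser, Wolsey, and Fisher, 1978). The sketch below is not the authors' formulation, which is not reproduced on this page; it is a minimal illustration using an assumed facility-location objective f(S) = Σ_i max_{j∈S} sim(i, j), a standard monotone submodular function that naturally discounts redundant (mutually similar) images, matching point i).

```python
import numpy as np

def greedy_batch_select(sim, batch_size):
    """Greedy maximization of the facility-location objective
    f(S) = sum_i max_{j in S} sim[i, j].

    Because f is monotone submodular, the greedy solution is within
    a (1 - 1/e) factor of the optimal batch (Nemhauser et al., 1978).
    `sim` is an (n, n) matrix of pairwise image similarities.
    This is an illustrative sketch, not the paper's exact algorithm.
    """
    n = sim.shape[0]
    selected = []
    # current_max[i] = similarity of image i to its closest selected image.
    current_max = np.zeros(n)
    for _ in range(batch_size):
        best_gain, best_j, best_max = -np.inf, None, None
        for j in range(n):
            if j in selected:
                continue
            # Marginal gain of adding image j to the batch.
            new_max = np.maximum(current_max, sim[:, j])
            gain = new_max.sum() - current_max.sum()
            if gain > best_gain:
                best_gain, best_j, best_max = gain, j, new_max
        selected.append(best_j)
        current_max = best_max
    return selected

# Toy example: images 0 and 1 are near-duplicates (similarity 0.9),
# so selecting both would be redundant.
sim = np.array([[1.0, 0.9, 0.1, 0.0],
                [0.9, 1.0, 0.1, 0.0],
                [0.1, 0.1, 1.0, 0.2],
                [0.0, 0.0, 0.2, 1.0]])
print(greedy_batch_select(sim, 2))  # → [0, 2]: the duplicate (1) is skipped
```

Note how the diminishing-returns property does the work: once image 0 is in the batch, the marginal gain of its near-duplicate 1 collapses to 0.1, so the greedy step prefers the dissimilar image 2 instead.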

Original language: English (US)
Title of host publication: 2010 IEEE International Conference on Robotics and Automation, ICRA 2010
Pages: 1873-1878
Number of pages: 6
DOIs
State: Published - 2010
Event: 2010 IEEE International Conference on Robotics and Automation, ICRA 2010 - Anchorage, AK, United States
Duration: May 3, 2010 - May 7, 2010

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729

Other

Other: 2010 IEEE International Conference on Robotics and Automation, ICRA 2010
Country/Territory: United States
City: Anchorage, AK
Period: 5/3/10 - 5/7/10

