JCSE, vol. 11, no. 1, pp. 1-8, 2017
DOI: http://dx.doi.org/10.5626/JCSE.2017.11.1.1
Memory-Efficient NBNN Image Classification
YoonSeok Lee and Sung-Eui Yoon
School of Computing, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, Korea
Abstract: Naive Bayes nearest neighbor (NBNN) is a simple image classifier based on identifying nearest neighbors. NBNN uses original image descriptors (e.g., SIFT) without vector quantization, preserving the discriminative power of the descriptors, and it generalizes well. However, it has a distinct disadvantage: its memory requirement can be prohibitively high when processing a large amount of data. To address this problem, we apply a spherical hashing binary code embedding technique to compactly encode data without significantly losing classification accuracy. We also propose using an inverted index to identify nearest neighbors among the binarized image descriptors. To demonstrate the benefits of our method, we apply it to two existing NBNN techniques with an image dataset. Using a 64-bit code length, we reduce memory consumption by a factor of 16 while achieving higher runtime performance and no significant loss of classification accuracy. This result is achieved by our compact encoding scheme, which preserves most of the information contained in the original image descriptors.
Keywords: Image classification; NBNN; Hashing; Memory efficiency; Indexing
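As a rough illustration of the classification scheme described in the abstract, the following Python sketch applies the standard NBNN decision rule over binarized local descriptors, using Hamming distance between codes. It is a brute-force toy, not the authors' implementation: the spherical hashing step that produces the binary codes and the inverted index used to accelerate the nearest-neighbor search are omitted, and all names (hamming, nbnn_classify, query_codes, class_codes) are illustrative.

```python
def hamming(a, b):
    """Hamming distance between two binary codes stored as Python ints."""
    return bin(a ^ b).count("1")

def nbnn_classify(query_codes, class_codes):
    """NBNN decision over binarized descriptors (brute-force sketch).

    query_codes : list of ints, one code per local descriptor of the query image.
    class_codes : dict mapping class label -> list of codes pooled from that
                  class's training images.

    Following the NBNN rule, each query descriptor contributes its distance to
    the nearest code of every class; the class with the smallest total wins.
    """
    totals = {c: 0 for c in class_codes}
    for q in query_codes:
        for c, codes in class_codes.items():
            totals[c] += min(hamming(q, t) for t in codes)
    return min(totals, key=totals.get)

# Toy usage with short codes standing in for 64-bit spherical hashing codes.
train = {
    "car":  [0b1010_1100, 0b1010_1111],
    "bike": [0b0101_0011, 0b0001_0011],
}
query = [0b1010_1101, 0b1110_1100]
print(nbnn_classify(query, train))  # -> "car"
```

On the reported 16-fold memory reduction: assuming a standard 128-dimensional SIFT descriptor stored with one byte per dimension (1,024 bits), replacing it with a 64-bit code shrinks per-descriptor storage by a factor of 16, matching the figure quoted in the abstract.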