Please use this identifier to cite or link to this item:
http://cmuir.cmu.ac.th/jspui/handle/6653943832/68334
Title: | Deep-based openset classification technique and its application in novel food categories recognition |
Authors: | Jakramate Bootkrajang; Jakarin Chawachat; Eakkapap Trakulsanguan |
Keywords: | Computer Science;Engineering |
Issue Date: | 1-Jan-2020 |
Abstract: | © Springer Nature Switzerland AG 2020. Being able to accurately recognise food categories from input images has many potentially useful applications, such as content-based recipe searching or automatic calorie-intake tracking. Convolutional neural networks (CNNs) have been successfully applied to a number of food recognition tasks. Despite their impressive predictive performance on closed datasets, there is currently no standard mechanism for distinguishing unknown object classes from known ones, which leads to invalid classification attempts even on non-food images. In this paper, we study a technique for detecting whether input images fall beyond the scope of a CNN's knowledge. The idea is to model the final activation vectors of data from the known classes using a data description method, namely the support vector data description. The network's prediction can then be rejected if the activation vector of the query image is too different from the known ones as generalised by the model. Experimental results on a subset of the UECFOOD100 dataset demonstrate that the proposed method accurately classifies instances from the known classes while also satisfactorily rejecting predictions on novel food images, compared with two commonly used baselines. |
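The abstract describes rejecting a CNN's prediction when the query's final activation vector falls outside a data description fitted to known-class activations. The following is a minimal illustrative sketch of that idea, not the authors' implementation: it substitutes scikit-learn's OneClassSVM (a close relative of SVDD) for the support vector data description, and stands in random vectors for real CNN activations; the hyperparameters and helper names are assumptions.

```python
# Hedged sketch: open-set rejection by modelling known-class CNN activation
# vectors with a one-class boundary. The paper uses support vector data
# description (SVDD); scikit-learn's OneClassSVM with an RBF kernel is used
# here as a widely available surrogate. CNN feature extraction is stubbed
# out with random vectors purely for illustration.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Placeholder for final-layer activation vectors of training images from the
# known food classes (in practice, extracted from the trained CNN).
known_activations = rng.normal(loc=0.0, scale=1.0, size=(500, 128))

# Fit the one-class model on known-class activations only.
# `nu` (assumed setting) bounds the fraction of training points treated as outliers.
boundary = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
boundary.fit(known_activations)

def classify_or_reject(activation_vector, cnn_prediction):
    """Return the CNN's predicted label if the activation vector lies inside
    the learned data description; otherwise reject it as a novel category."""
    inside = boundary.predict(activation_vector.reshape(1, -1))[0] == 1
    return cnn_prediction if inside else "unknown / novel category"

# Example query: an activation vector far from the known-class region is
# rejected rather than forced into one of the known food classes.
novel_query = rng.normal(loc=6.0, scale=1.0, size=128)
print(classify_or_reject(novel_query, cnn_prediction="ramen"))
```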
URI: | https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85065828833&origin=inward http://cmuir.cmu.ac.th/jspui/handle/6653943832/68334 |
ISSN: | 2194-5365; 2194-5357 |
Appears in Collections: | CMUL: Journal Articles |
Files in This Item:
There are no files associated with this item.
Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.