Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/62668
Full metadata record
DC Field / Value / Language
dc.contributor.author: Jakramate Bootkrajang (en_US)
dc.contributor.author: Jeerayut Chaijaruwanich (en_US)
dc.date.accessioned: 2018-11-29T07:39:02Z
dc.date.available: 2018-11-29T07:39:02Z
dc.date.issued: 2018-01-01 (en_US)
dc.identifier.issn: 1433-7541 (en_US)
dc.identifier.other: 2-s2.0-85053253900 (en_US)
dc.identifier.other: 10.1007/s10044-018-0750-z (en_US)
dc.identifier.uri: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85053253900&origin=inward (en_US)
dc.identifier.uri: http://cmuir.cmu.ac.th/jspui/handle/6653943832/62668
dc.description.abstract: © 2018, Springer-Verlag London Ltd., part of Springer Nature. Learning from labelled data is becoming increasingly challenging due to the inherent imperfection of training labels. Existing label noise-tolerant learning machines were primarily designed to tackle class-conditional noise, which occurs at random, independently of the input instances. Relatively little attention has been given to a more general type of label noise that is influenced by the input features. In this paper, we address the problem of learning a classifier in the presence of instance-dependent label noise by developing a novel label noise model, which is expected to capture the variation of the label noise rate within a class. This is accomplished by adopting the probability density function of a mixture of Gaussians to approximate the label-flipping probabilities. Experimental results demonstrate the effectiveness of the proposed method over existing approaches. (en_US) [An illustrative sketch of the mixture-of-Gaussians noise model appears after this record.]
dc.subject: Computer Science (en_US)
dc.title: Towards instance-dependent label noise-tolerant classification: a probabilistic approach (en_US)
dc.type: Journal (en_US)
article.title.sourcetitle: Pattern Analysis and Applications (en_US)
article.stream.affiliations: Chiang Mai University (en_US)
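
The abstract describes modelling an instance-dependent label-flipping probability with a mixture-of-Gaussians density. The paper itself is not attached to this record, so the Python sketch below is only a guess at the general idea, not the authors' method: it fits a Gaussian mixture per class, treats instances in low-density regions of their own class as more likely to carry a flipped label, and then down-weights those instances when training a standard classifier. The function name flip_probability, the 0.4 noise scale, and the re-weighting step are all illustrative assumptions.

import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy two-class data.
X = np.vstack([rng.normal(-2, 1, (200, 2)), rng.normal(2, 1, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

# Fit a Gaussian mixture to each class; its density drives the (assumed)
# instance-dependent flip probability within that class.
gmms = {c: GaussianMixture(n_components=2, random_state=0).fit(X[y == c])
        for c in (0, 1)}

def flip_probability(x, c, scale=0.4):
    """Hypothetical noise model: the lower the mixture density of x under
    its own class c, the higher the assumed label-flipping probability.
    Returns a value in [0, scale)."""
    log_density = gmms[c].score_samples(x.reshape(1, -1))[0]
    density = np.exp(log_density)
    return scale * (1.0 - density / (density + 1e-3))

# One simple way to use such a model: weight each training example by the
# estimated probability that its label is clean.
clean_prob = np.array([1.0 - flip_probability(x, c) for x, c in zip(X, y)])
clf = LogisticRegression().fit(X, y, sample_weight=clean_prob)
print("training accuracy:", clf.score(X, y))

In the paper's actual probabilistic formulation the flipping probabilities presumably enter a likelihood-based learning objective rather than simple sample re-weighting; the sketch only conveys how a mixture density can make the noise rate vary within a class.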
Appears in Collections: CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.


Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.