Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/54374
Title: MSER based text localization for multi-language using double-threshold scheme
Authors: Chayut Wiwatcharakoses
Karn Patanukhom
Keywords: Computer Science
Engineering
Issue Date: 1-Jan-2015
Abstract: © 2015 ICST. In this paper, a region-based text localization method that is robust across multiple languages is presented. Maximally Stable Extremal Regions (MSERs) are used to detect candidate text areas. The MSER components are grouped based on their connectivity in a feature space, using a newly proposed rule for assigning connectivity. The groups of components are classified into three classes: text regions with high confidence, text regions with low confidence, and non-text regions. A chain of text-attribute constraint decisions with a double-threshold scheme is developed to identify text regions. The sequence of constraint decisions is designed to minimize complexity based on short-circuit evaluation of logic operators. Regions that satisfy all strong constraints are considered text regions with high confidence, while regions that fail some strong constraints but satisfy all weak constraints are considered text regions with low confidence. The final text regions are obtained from all high-confidence text regions together with the low-confidence text regions that have connectivity to high-confidence regions. The proposed scheme is evaluated on natural scene images covering a total of nine languages with different text alignments and camera views. The experiment shows that the proposed scheme provides satisfactory results in comparison with the baseline method.
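The double-threshold selection described in the abstract resembles hysteresis thresholding: strong-constraint regions seed the result, and weak-constraint regions are kept only if they connect (possibly transitively) to a seed. A minimal sketch of that selection step, assuming hypothetical predicate functions `strong_ok`, `weak_ok`, and a pairwise `connected` test standing in for the paper's actual constraints and connectivity rule:

```python
from collections import deque

def double_threshold_select(regions, strong_ok, weak_ok, connected):
    """Hysteresis-style selection of candidate text regions.

    regions       : iterable of candidate region identifiers
    strong_ok(r)  : True if r satisfies all strong constraints
    weak_ok(r)    : True if r satisfies all weak constraints
    connected(a,b): True if regions a and b are neighbours (e.g. in
                    the feature space used for grouping)
    """
    # High-confidence regions pass every strong constraint.
    high = {r for r in regions if strong_ok(r)}
    # Low-confidence regions fail some strong constraint but pass
    # every weak constraint.
    low = {r for r in regions if r not in high and weak_ok(r)}

    # Propagate acceptance from high-confidence seeds to any
    # low-confidence region reachable through the connectivity graph.
    accepted = set(high)
    queue = deque(high)
    while queue:
        cur = queue.popleft()
        for r in low - accepted:
            if connected(cur, r):
                accepted.add(r)
                queue.append(r)
    return accepted
```

For example, a low-confidence region adjacent to another low-confidence region that touches a seed is still accepted, because acceptance spreads breadth-first through the chain; a weak region with no path to any seed is discarded.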
URI: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=84943339864&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/54374
Appears in Collections:CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.
Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.