
张敏灵


Journal Paper

CoTrade: Confident Co-Training With Data Editing

(not available)

IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 2011, 41(6): 1612 - 162 | June 23, 2011 | DOI: 10.1109/TSMCB.2011.2157998

URL: https://ieeexplore.ieee.org/document/5910412

Abstract / Description

Co-training is one of the major semi-supervised learning paradigms that iteratively trains two classifiers on two different views, and uses the predictions of either classifier on the unlabeled examples to augment the training set of the other. During the co-training process, especially in initial rounds when the classifiers have only mediocre accuracy, it is quite possible that one classifier will receive labels on unlabeled examples erroneously predicted by the other classifier. Therefore, the performance of co-training style algorithms is usually unstable. In this paper, the problem of how to reliably communicate labeling information between different views is addressed by a novel co-training algorithm named COTRADE. In each labeling round, COTRADE carries out the label communication process in two steps. First, confidence of either classifier's predictions on unlabeled examples is explicitly estimated based on specific data editing techniques. Secondly, a number of predicted labels with higher confidence of either classifier are passed to the other one, where certain constraints are imposed to avoid introducing undesirable classification noise. Experiments on several real-world datasets across three domains show that COTRADE can effectively exploit unlabeled data to achieve better generalization performance.
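The co-training loop described above can be sketched in a few lines. Note that this is a minimal illustration of the generic paradigm only, not the authors' COTRADE algorithm: confidence here is taken directly from predicted class probabilities, without COTRADE's data-editing-based confidence estimation or its noise-control constraints. The two-view split, classifier choice, and round sizes are all illustrative assumptions.

```python
# Minimal co-training sketch: two classifiers, each trained on its own
# view, iteratively label their most confident unlabeled examples for
# the shared training pool. NOT the COTRADE algorithm from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB

def co_train(X1, X2, y, seed_idx, rounds=5, per_round=10):
    """Grow the labeled set using each view's most confident predictions."""
    labeled = set(seed_idx)
    unlabeled = set(range(len(y))) - labeled
    y_work = y.copy()  # labels for unlabeled points get filled in here
    clf1, clf2 = GaussianNB(), GaussianNB()
    for _ in range(rounds):
        idx = sorted(labeled)
        clf1.fit(X1[idx], y_work[idx])
        clf2.fit(X2[idx], y_work[idx])
        # each classifier passes its most confident predictions to the pool
        for clf, X_view in ((clf1, X1), (clf2, X2)):
            pool = sorted(unlabeled)
            if not pool:
                break
            proba = clf.predict_proba(X_view[pool])
            top = np.argsort(-proba.max(axis=1))[:per_round]
            for t in top:
                i = pool[t]
                y_work[i] = proba[t].argmax()
                labeled.add(i)
                unlabeled.discard(i)
    return clf1, clf2

# Toy demo: split one synthetic feature set into two artificial "views".
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X1, X2 = X[:, :5], X[:, 5:]
seed = np.random.default_rng(0).choice(len(y), size=20, replace=False)
clf1, clf2 = co_train(X1, X2, y, seed)
# Combine the two views by averaging their class probabilities.
pred = (clf1.predict_proba(X1) + clf2.predict_proba(X2)).argmax(axis=1)
acc = (pred == y).mean()
```

A real two-view setting would use genuinely independent feature sets (e.g., page text vs. hyperlink text in the classic web-page example); splitting one feature vector in half, as done here for the demo, weakens the independence assumption co-training relies on.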

Keywords:

[Disclaimer] All of the content below was uploaded by [张敏灵] on November 30, 2020 at 15:03:24, and copyright remains with the original creator. This article represents the author's own views only and is unrelated to this website. This website remains neutral with respect to the statements and opinions expressed herein and makes no express or implied guarantee of the accuracy, reliability, or completeness of the content. Readers should treat it as reference only and assume full responsibility for its use.
