A New Improved Weak Classifier Training Algorithm for AdaBoost

XIE Hongyue, FANG Yuchun, CAI Qiyun (School of Computer Engineering and Science, Shanghai University, Shanghai 200072, China)

Abstract
AdaBoost is a popular classification algorithm in machine learning. By studying the characteristics of weak classifiers, this paper proposes two new methods for computing the threshold and bias of a weak classifier. Both methods keep the recognition rate of each weak classifier above 50%, which guarantees that AdaBoost training converges once the number of weak classifiers reaches a certain level. Simulation results for the two threshold-and-bias calculation methods show that, with the misclassification rate kept within an acceptable range, both methods obtain a strong classifier with a high recognition rate using relatively few weak classifiers.
Keywords
A New Weak Classifier Training Method for the AdaBoost Algorithm


Abstract
AdaBoost is a very popular classification algorithm in machine learning. By studying the characteristics of the weak classifier, this paper proposes two new methods to calculate the threshold and bias of the weak classifier. The two methods make the correct classification rate of each weak classifier larger than 50%, ensuring that AdaBoost training converges once the number of weak classifiers reaches a certain level. Simulation experiments show that, when the error rate is within an acceptable range, both methods can build a strong classifier that maintains a high correct rate while using fewer weak classifiers.
Keywords
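
The abstract does not state the two threshold and bias formulas themselves, but the convergence claim rests on one simple property: every weak classifier must be at least 50% accurate under the current sample weights. The following Python/NumPy sketch is only an illustration of that property, not the authors' method: a threshold-type weak classifier whose polarity (standing in for the bias here) is flipped whenever its weighted error exceeds 0.5, used inside a standard AdaBoost loop. The function names train_stump, adaboost and predict are hypothetical.

import numpy as np

def train_stump(X, y, w):
    # Train a one-feature threshold ("decision stump") weak classifier.
    # If a candidate's weighted error exceeds 0.5, its polarity is flipped,
    # so the returned stump always has weighted accuracy >= 50%.
    n, d = X.shape
    best = None
    for j in range(d):
        for thr in np.unique(X[:, j]):
            pred = np.where(X[:, j] <= thr, 1, -1)
            err = np.sum(w[pred != y])
            polarity = 1
            if err > 0.5:                      # flip the sign instead of accepting <50% accuracy
                err, polarity = 1.0 - err, -1
            if best is None or err < best[0]:
                best = (err, j, thr, polarity)
    return best                                # (weighted error, feature, threshold, polarity)

def adaboost(X, y, n_rounds=20):
    # Plain AdaBoost using the stump above; labels y must be in {-1, +1}.
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        err, j, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-12)                  # guard against division by zero
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = pol * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified samples
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    # Weighted vote of all weak classifiers gives the strong classifier.
    score = np.zeros(len(X))
    for alpha, j, thr, pol in ensemble:
        score += alpha * pol * np.where(X[:, j] <= thr, 1, -1)
    return np.sign(score)

Because every stump returned by train_stump has weighted error below 0.5, each round's alpha is positive and the usual exponential bound on the strong classifier's training error applies, which is the convergence behaviour the abstract refers to.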
