Talk Title:
Consistency Analysis of the Minimum Error Entropy Algorithm
Speaker:
Qiang Wu (吴强)
Speaker Bio:
Affiliation:
Middle Tennessee State University
Position:
Assistant Professor
Time:
Friday, June 5, 2015, 4:00-5:30 PM
Venue:
Lecture Hall 201, School of Mathematics and Statistics
Abstract:
Information theoretic learning (ITL) is an important research area in signal processing and machine learning. It uses entropies and divergences from information theory in place of the conventional statistical descriptors of variance and covariance. The empirical minimum error entropy (MEE) algorithm is a typical approach within this framework and has been successfully used in both regression and classification problems. In this talk, I will discuss the consistency analysis of the MEE algorithm. For this purpose, we introduce two types of consistency. Error entropy consistency requires the error entropy of the learned function to approximate the minimum error entropy; it holds when the bandwidth parameter tends to 0 at an appropriate rate. Regression consistency requires the learned function to approximate the regression function. We prove that error entropy consistency implies regression consistency for homoskedastic models, where the noise is independent of the input variable. For heteroskedastic models, however, a counterexample shows that the two types of consistency do not necessarily coincide. A surprising result is that regression consistency holds when the bandwidth parameter is sufficiently large, and regression consistency with a fixed bandwidth parameter is shown for two classes of special models. These results illustrate the subtle behavior of the MEE algorithm.
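The abstract does not spell out the estimator, but for orientation the following is a minimal sketch of the empirical error entropy criterion as it is commonly written in the MEE literature, assuming Renyi's quadratic entropy estimated with a Gaussian Parzen window of bandwidth h (the bandwidth parameter discussed above). The function name empirical_error_entropy and the toy residual samples are illustrative, not from the talk; the empirical MEE algorithm selects the function whose residuals minimize this quantity.

import numpy as np

def empirical_error_entropy(errors, h):
    """Empirical Renyi quadratic error entropy of a residual sample:
    H_h(e) = -log( (1/n^2) * sum_{i,j} G_h(e_i - e_j) ),
    where G_h is a Gaussian (Parzen) window with bandwidth h."""
    e = np.asarray(errors, dtype=float)
    diffs = e[:, None] - e[None, :]            # all pairwise differences e_i - e_j
    window = np.exp(-diffs ** 2 / (2.0 * h ** 2)) / (np.sqrt(2.0 * np.pi) * h)
    information_potential = window.mean()      # (1/n^2) * double sum over i, j
    return -np.log(information_potential)

# Toy illustration (hypothetical data): residuals of a good fit concentrate
# near 0, so their estimated error entropy is lower than that of a poor fit.
rng = np.random.default_rng(0)
good_fit_errors = 0.1 * rng.standard_normal(200)
bad_fit_errors = 1.0 * rng.standard_normal(200)
for h in (0.05, 0.5, 5.0):                     # small vs. large bandwidth
    print(h, empirical_error_entropy(good_fit_errors, h),
          empirical_error_entropy(bad_fit_errors, h))

The loop over h mirrors the two regimes in the abstract: the consistency analysis concerns what the minimizer of this criterion converges to as the bandwidth parameter tends to 0 at an appropriate rate, or is taken sufficiently large.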