A Summary of Common Rough-Set Feature Selection (Attribute Reduction) Algorithms
These algorithms are implemented mainly with the FEAST and MIToolbox toolboxes. For toolbox installation, see: https://blog.csdn.net/qq_44822612/article/details/131816235
The supported criteria are listed below, followed by a minimal sketch of the common call pattern:
MIM, MRMR, MIFS, CMIM, JMI, DISR, CIFE, ICAP, CONDRED, CMI, RELIEF, FCBF, BETAGAMMA
As well as weighted implementations of: MIM, CMIM, JMI, DISR, CMI
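All of these criteria go through the same feast wrapper: the criterion name, the number of features to select, the data matrix, the labels, and (for some criteria) one extra parameter. The following is only a sketch of that call pattern, assuming both toolboxes are compiled and on the MATLAB path and that the features in data are discrete (integer-coded), since FEAST/MIToolbox estimate mutual information from discrete counts:

% Common FEAST call pattern (sketch; assumes FEAST and MIToolbox are on the MATLAB path).
% data   : numSamples-by-numFeatures matrix of discrete, integer-coded feature values
% labels : numSamples-by-1 vector of class labels
% k      : number of features to select
k = 10;
idxNoParam   = feast('mim',  k, data, labels);       % criteria without an extra parameter
idxWithParam = feast('mifs', k, data, labels, 0.7);  % criteria with an extra parameter (here beta)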
- MRMR algorithm
selectedIndices = feast('mrmr', 10, data, labels); % mRMR: select the top 10 features
H. Peng, F. Long, and C. Ding, “Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 27, no. 8, pp. 1226–1238, Aug. 2005.
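At each step mRMR adds the feature with the highest relevance I(X;Y) minus its average mutual information with the already-selected features. A self-contained sketch on synthetic discrete data (the random data and variable names are illustrative only, not from the original post):

% Illustrative mRMR run on synthetic discrete data.
numSamples  = 200;
numFeatures = 50;
data   = randi(5, numSamples, numFeatures);   % integer-coded features with values 1..5
labels = mod(data(:,1) + data(:,2), 2) + 1;   % labels depend on features 1 and 2 (illustrative)

selectedIndices = feast('mrmr', 10, data, labels);   % column indices of the 10 selected features
disp(selectedIndices');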
- MIFS algorithm
selectedIndices = feast('mifs', 10, data, labels, 0.7); % MIFS with beta = 0.7: select the top 10 features
R. Battiti, “Using mutual information for selecting features in supervised neural net learning,” IEEE Trans. Neural Netw., vol. 5, no. 4, pp. 537–550, Jul. 1994.
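In MIFS the beta parameter weights the redundancy penalty: a candidate feature is scored by its relevance I(X;Y) minus beta times the sum of its mutual information with the already-selected features, so beta = 0 reduces to a plain MIM ranking and larger beta penalizes redundancy more strongly. A sketch of sweeping beta to see how the selected set changes (reuses the synthetic data and labels from the sketch above):

% Illustrative sweep over the MIFS beta parameter (reuses data and labels from above).
for beta = [0 0.5 0.7 1.0]
    idx = feast('mifs', 10, data, labels, beta);
    fprintf('beta = %.1f -> selected features: %s\n', beta, mat2str(idx'));
end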
- FCBF algorithm
selectedIndices = feast('fcbf', 10, data, labels, 0); % FCBF with relevance threshold 0: select the top 10 features
L. Yu and H. Liu, “Efficient feature selection via analysis of relevance and redundancy,” J. Mach. Learn. Res., vol. 5, pp. 1205–1224, 2004.
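Unlike the beta-weighted criteria, the final argument to 'fcbf' is a relevance threshold: FCBF keeps only features whose symmetrical uncertainty with the labels exceeds the threshold and then removes redundant ones, so a threshold of 0 roughly discards only features with no measured relevance. Below is a sketch of using the selected features downstream; fitcknn is only an illustrative choice and assumes the Statistics and Machine Learning Toolbox:

% Illustrative end-to-end use of the FCBF-selected features (reuses data and labels from above).
selectedIndices = feast('fcbf', 10, data, labels, 0);   % 0 = relevance threshold on symmetrical uncertainty
reducedData     = data(:, selectedIndices);             % keep only the selected feature columns

% Any classifier can be trained on the reduced data; 5-nearest-neighbours is just an example.
mdl    = fitcknn(reducedData, labels, 'NumNeighbors', 5);
cvLoss = kfoldLoss(crossval(mdl, 'KFold', 5));
fprintf('5-fold CV error with %d selected features: %.3f\n', numel(selectedIndices), cvLoss);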