CIS | Free for CIS members, IEEE members, and non-members | Length: 00:54:54
16 Apr 2013

Abstract: Designing a monolithic system for a large and complex learning task is hard. Divide-and-conquer is a common strategy for tackling such large and complex problems, and ensembles can be regarded as an automatic approach to divide-and-conquer. Many ensemble methods, including boosting, bagging, and negative correlation learning, have been used in machine learning and data mining for many years. This talk will describe three examples of ensemble methods in multi-objective learning, online learning with concept drift, and multi-class imbalance learning. Given the important role of diversity in ensemble methods, some discussion and analysis will be presented to build a better understanding of how and when diversity may help ensemble learning.
Some materials used in the talk were based on the following papers:
A. Chandra and X. Yao, "Ensemble learning using multi-objective evolutionary algorithms," Journal of Mathematical Modelling and Algorithms, 5(4):417-445, December 2006.
L. L. Minku and X. Yao, "DDD: A New Ensemble Approach for Dealing with Concept Drift," IEEE Transactions on Knowledge and Data Engineering, 24(4):619-633, April 2012.
S. Wang and X. Yao, "Multi-Class Imbalance Problems: Analysis and Potential Solutions," IEEE Transactions on Systems, Man, and Cybernetics, Part B, 42(4):1119-1130, August 2012.
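The sketch below is not from the talk; it is a minimal illustration of the abstract's two themes, bagging as an automatic form of divide-and-conquer and diversity among base learners. It assumes scikit-learn and a synthetic dataset, and uses average pairwise disagreement as one simple proxy for diversity, not necessarily the measures analyzed in the talk.

```python
# Minimal sketch (assumed illustration): bagging an ensemble of decision
# trees and measuring pairwise disagreement among the base learners.
import numpy as np
from itertools import combinations
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

# Synthetic two-class data stands in for any learning task.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Each base tree is trained on a different bootstrap sample of the
# training data: the "divide" step that bagging automates.
ensemble = BaggingClassifier(n_estimators=25, random_state=0).fit(X_tr, y_tr)

# Pairwise disagreement: the fraction of test points on which two base
# learners predict different labels, averaged over all pairs.
preds = np.array([est.predict(X_te) for est in ensemble.estimators_])
pairs = combinations(range(len(preds)), 2)
disagreement = np.mean([(preds[i] != preds[j]).mean() for i, j in pairs])

print(f"ensemble accuracy : {ensemble.score(X_te, y_te):.3f}")
print(f"mean disagreement : {disagreement:.3f}")
```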
