  • CIS Members: Free
  • IEEE Members: Free
  • Non-members: Free
  • Length: 01:40:39
  • Date: 19 Jul 2020

This tutorial will first introduce the main randomization-based learning paradigms with closed-form solutions, such as randomization-based feedforward neural networks, randomization-based recurrent neural networks, and kernel ridge regression. The popular instantiation of the feedforward type, the random vector functional link neural network (RVFL), originated in the early 1990s; other feedforward methods include random weight neural networks (RWNN) and extreme learning machines (ELM). Reservoir computing methods such as echo state networks (ESN) and liquid state machines (LSM) are randomized recurrent networks. Another paradigm is based on the kernel trick, such as kernel ridge regression, which incorporates randomization to scale to large training data. The tutorial will also consider computational complexity as the scale of the classification/forecasting problems grows. A further randomization-based paradigm is the random forest, which exhibits highly competitive performance. The tutorial will also present extensive benchmarking studies on classification and forecasting datasets.
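
As a rough illustration of the closed-form, randomization-based feedforward paradigm mentioned above, the sketch below implements an RVFL-style model in NumPy: the hidden-layer weights are drawn at random and kept fixed, and only the output weights are computed, via a ridge-regression solution over the hidden activations concatenated with the direct input links. The function names and hyperparameters (n_hidden, ridge) are illustrative assumptions, not taken from the tutorial.

```python
# Minimal RVFL-style sketch (assumed setup, not the tutorial's code):
# random fixed hidden layer + direct input links, output weights in closed form.
import numpy as np

rng = np.random.default_rng(0)

def rvfl_fit(X, Y, n_hidden=200, ridge=1e-2):
    """Fit an RVFL-style model: random hidden layer, closed-form ridge readout."""
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))   # fixed random hidden weights
    b = rng.standard_normal(n_hidden)                 # fixed random biases
    H = np.tanh(X @ W + b)                            # hidden activations
    D = np.hstack([H, X])                             # direct link: append raw inputs
    # Closed-form ridge solution: beta = (D^T D + lambda I)^{-1} D^T Y
    beta = np.linalg.solve(D.T @ D + ridge * np.eye(D.shape[1]), D.T @ Y)
    return W, b, beta

def rvfl_predict(X, W, b, beta):
    D = np.hstack([np.tanh(X @ W + b), X])
    return D @ beta

# Toy usage: regression on a noisy sine wave.
X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
Y = np.sin(X) + 0.05 * rng.standard_normal(X.shape)
params = rvfl_fit(X, Y)
print(rvfl_predict(X[:5], *params))
```

Because training reduces to a single linear solve, the cost is dominated by forming and factorizing the (n_hidden + n_features)-sized Gram matrix, which is the computational-complexity trade-off the tutorial examines as problem scale grows.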
