DEEP KERNEL LEARNING NETWORKS WITH MULTIPLE LEARNING PATHS

Ping Xu, Yue Wang, Xiang Chen, Zhi Tian

Length: 00:12:19
13 May 2022

This paper proposes deep kernel learning networks with multiple learning paths (DKL-MLP) for nonlinear function approximation. Leveraging the random feature (RF) mapping technique, kernel methods can be implemented as a two-layer neural network with a drastically reduced weight-training workload. Motivated by the representational power of deep architectures in deep neural networks, we devise a vanilla deep kernel learning network (DKL) by applying RF mapping at each layer and training only the last layer. To improve the learning performance of DKL, we add multiple trainable paths to DKL and develop the DKL-MLP method, so that implicit information flowing from earlier hidden layers to the output layer can also be learned. We prove that both DKL and DKL-MLP permit universal representation of a wide variety of functions of interest with arbitrarily small error and have no bad local minima. Numerical experiments on both regression and classification tasks demonstrate the learning performance and computational efficiency of the proposed methods.
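
As a rough illustration of the architecture sketched in the abstract, the following PyTorch snippet stacks frozen random-Fourier-feature layers (Rahimi-Recht features for a Gaussian kernel) and attaches a trainable linear path from every hidden layer to the output, so only the path weights are optimized. The class names, hyperparameters, and the exact way the paths are parameterized are illustrative assumptions for this sketch, not the paper's reference implementation.

    import math
    import torch
    import torch.nn as nn

    class RFLayer(nn.Module):
        """Random Fourier feature map for a Gaussian kernel (Rahimi & Recht).
        The random weights are drawn once and kept frozen, so this layer is never trained."""
        def __init__(self, in_dim, num_features, sigma=1.0):
            super().__init__()
            # buffers are saved with the model but ignored by the optimizer
            self.register_buffer("W", torch.randn(num_features, in_dim) / sigma)
            self.register_buffer("b", 2 * math.pi * torch.rand(num_features))
            self.scale = math.sqrt(2.0 / num_features)

        def forward(self, x):
            return self.scale * torch.cos(x @ self.W.T + self.b)

    class DKL_MLP(nn.Module):
        """Stacked frozen RF layers with a trainable linear path from each hidden
        layer to the output (an assumed reading of the multiple-learning-path idea)."""
        def __init__(self, in_dim, num_features, num_layers, out_dim):
            super().__init__()
            dims = [in_dim] + [num_features] * num_layers
            self.rf_layers = nn.ModuleList(
                [RFLayer(dims[i], dims[i + 1]) for i in range(num_layers)]
            )
            self.paths = nn.ModuleList(
                [nn.Linear(num_features, out_dim) for _ in range(num_layers)]
            )

        def forward(self, x):
            out = 0.0
            for rf, path in zip(self.rf_layers, self.paths):
                x = rf(x)            # frozen random-feature mapping
                out = out + path(x)  # trainable path to the output
            return out

    # Usage: only the path weights carry gradients; the RF layers hold no parameters.
    model = DKL_MLP(in_dim=8, num_features=256, num_layers=3, out_dim=1)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

Dropping the per-layer paths (keeping only the final one) recovers the vanilla DKL variant, in which only the last layer is learned.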
