SPS Members: Free
IEEE Members: $11.00
Non-members: $15.00
Length: 13:00
09 Jun 2020

Canonical correlation analysis (CCA) is a well-documented subspace learning approach widely used to seek hidden sources common to two or more datasets. CCA has been applied in various learning tasks, such as dimensionality reduction, blind source separation, classification, and data fusion. Specifically, CCA aims at finding subspaces for multi-view datasets such that the projections of the multiple views onto the sought subspaces are maximally correlated. Simple linear projections, however, may fail to capture nonlinear relationships across the views, which motivates the development of nonlinear CCA. Yet neither conventional CCA nor its nonlinear variants take data privacy into account, which is crucial especially when dealing with personal data. To address this limitation, the present paper studies a differentially private scheme for nonlinear CCA. Numerical tests on real datasets are carried out to showcase the effectiveness of the proposed algorithms.
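For context, the CCA criterion mentioned in the abstract seeks projection matrices Wx and Wy that maximize the correlation between the projected views X Wx and Y Wy. The sketch below shows classical linear CCA in NumPy via an SVD of the whitened cross-covariance, plus a generic Gaussian-mechanism perturbation of the covariance matrices as one common way to inject differential privacy. The function names, the regularization parameter reg, and the noise calibration are illustrative assumptions only; the paper's actual differentially private nonlinear CCA scheme is not described in the abstract.

```python
import numpy as np

def linear_cca(X, Y, k, reg=1e-6):
    """Classical (linear) CCA: find k pairs of directions whose
    projections of X and Y are maximally correlated.
    Returns (Wx, Wy, canonical correlations)."""
    n = X.shape[0]
    Xc = X - X.mean(axis=0)          # center each view
    Yc = Y - Y.mean(axis=0)

    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n

    def inv_sqrt(C):
        # symmetric inverse square root via eigendecomposition
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Cxx_is, Cyy_is = inv_sqrt(Cxx), inv_sqrt(Cyy)
    # SVD of the whitened cross-covariance gives the canonical directions
    U, s, Vt = np.linalg.svd(Cxx_is @ Cxy @ Cyy_is)
    Wx = Cxx_is @ U[:, :k]
    Wy = Cyy_is @ Vt.T[:, :k]
    return Wx, Wy, s[:k]

def noisy_covariance(C, epsilon, delta, sensitivity):
    """Hypothetical Gaussian-mechanism perturbation (not necessarily the
    paper's scheme): add symmetric Gaussian noise calibrated to the
    covariance's sensitivity to obtain an (epsilon, delta)-DP release."""
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    E = np.random.normal(0.0, sigma, size=C.shape)
    return C + (E + E.T) / 2.0

if __name__ == "__main__":
    # toy two-view data sharing one latent source
    rng = np.random.default_rng(0)
    z = rng.normal(size=(500, 1))
    X = z @ rng.normal(size=(1, 5)) + 0.1 * rng.normal(size=(500, 5))
    Y = z @ rng.normal(size=(1, 4)) + 0.1 * rng.normal(size=(500, 4))
    Wx, Wy, corrs = linear_cca(X, Y, k=2)
    print("canonical correlations:", corrs)
```

In this style of pipeline, privacy would be enforced by replacing the clean covariance matrices with their noisy counterparts before the whitening and SVD steps; nonlinear variants typically first map each view through a feature transformation (e.g., a kernel or neural map) and then apply the same correlation-maximizing machinery.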
