PERFORMANCE-COMPLEXITY TRADE-OFF IN LARGE DIMENSIONAL STATISTICS

Tayeb Zarrouk, Romain Couillet, Florent Chatelain, Nicolas Le Bihan

Length: 12:39
21 Sep 2020

This article introduces a random matrix framework for analyzing the performance-complexity trade-off in a class of machine learning algorithms, under a large dimensional data regime. Specifically, we analyze the spectral properties of a kernel random matrix K to which a sparsity mask B is applied; this reduces the number of entries to evaluate, thereby reducing complexity, while weakening the power of statistical inference on K, thereby impeding performance. We exhibit a phase transition phenomenon below which spectral methods must fail, and whose location depends on the sparsity structure of B. This has immediate applications to the fundamental limits of complexity-reduced spectral clustering as well as principal component analysis.
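The masking operation described in the abstract can be illustrated with a small simulation. The sketch below is not the authors' exact setting: it uses a toy two-class Gaussian mixture, a linear kernel, and an i.i.d. Bernoulli sparsity mask (all choices are illustrative assumptions), then runs sign-based spectral clustering on the Hadamard product B ∘ K. Only a fraction eps of the kernel entries is evaluated, which is the source of the complexity reduction; the remaining signal in the top eigenvector is what performance hinges on.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: n points in dimension p with opposite class means.
# All parameters here are illustrative, not those of the paper.
n, p = 400, 200
mu = np.zeros(p)
mu[0] = 3.0
y = np.repeat([1, -1], n // 2)                     # ground-truth labels
X = rng.standard_normal((n, p)) + np.outer(y, mu)  # rows = data points

# Kernel matrix K (linear kernel for simplicity) and a symmetric
# Bernoulli(eps) sparsity mask B: only a fraction eps of the off-diagonal
# entries of K are actually evaluated.
eps = 0.3
K = X @ X.T / p
B = (rng.random((n, n)) < eps).astype(float)
B = np.triu(B, 1)
B = B + B.T                                        # symmetric, zero diagonal
K_masked = B * K                                   # entrywise (Hadamard) product

# Spectral inference on the masked kernel: cluster by the sign pattern
# of the top eigenvector of B ∘ K.
vals, vecs = np.linalg.eigh(K_masked)
v = vecs[:, -1]
pred = np.sign(v)
# Labels are recovered only up to a global sign flip.
acc = max(np.mean(pred == y), np.mean(pred == -y))
print(f"clustering accuracy using {eps:.0%} of kernel entries: {acc:.2f}")
```

Rerunning this sketch while shrinking `eps` (or the separation `mu`) makes the accuracy collapse toward 0.5, a chance-level outcome consistent with the phase transition behavior the abstract describes; the paper characterizes where that transition occurs as a function of the structure of B.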
