10 May 2022

Quantization of signals is an integral part of modern signal processing applications, such as sensing, communication, and inference. In an attempt to satisfy physical constraints while simultaneously attaining substantial performance gains, we consider systems with mixed-resolution data, comprising both 1-bit quantized and continuous-valued measurements. First, we describe the linear minimum mean-squared error (LMMSE) estimator and its associated mean-squared error (MSE) for the general mixed-resolution model. However, the MSE of the LMMSE estimator requires the inversion of a matrix whose dimensions grow with the number of measurements, and thus it may not be a tractable tool for optimization and system design. Therefore, we derive a closed-form analytic expression for the MSE of the LMMSE estimator under the linear Gaussian orthonormal (LGO) measurement model. We then solve the resource allocation optimization problem, with the proposed tractable MSE as the objective function and under a power constraint, by using a one-dimensional search. Further, we present the concept of dithering for mixed-resolution models and optimize the dithering noise. We discuss two common special cases of the LGO model: 1) scalar parameter estimation, and 2) channel estimation in multiple-input multiple-output (MIMO) communication systems with mixed analog-to-digital converters (ADCs). Simulations show that the proposed resource allocation and dithering policies provide significant performance improvement.
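
The mixed-resolution LMMSE estimator described in the abstract can be illustrated with a short numerical sketch. The Python snippet below is a minimal, hypothetical example and not the paper's implementation: it forms the LMMSE estimator from Monte Carlo sample covariances of mixed 1-bit and continuous-valued measurements under a simple linear Gaussian model, rather than from the closed-form LGO expressions derived in the paper. All dimensions, noise levels, and variable names are assumptions made for illustration.

```python
# Minimal sketch (assumed setup, not the paper's implementation): LMMSE
# estimation of a Gaussian parameter vector from mixed-resolution data,
# where part of the measurement vector is kept continuous-valued and the
# rest is 1-bit quantized. The LMMSE coefficients are obtained here from
# Monte Carlo sample covariances instead of the paper's closed-form
# LGO-based expressions.
import numpy as np

rng = np.random.default_rng(0)

d = 4           # parameter dimension
m_cont = 6      # number of continuous-valued (high-resolution) measurements
m_1bit = 6      # number of 1-bit quantized measurements
m = m_cont + m_1bit

A = rng.standard_normal((m, d))   # known linear measurement matrix
sigma = 0.5                       # measurement-noise standard deviation

# Draw parameters theta ~ N(0, I_d) and noisy linear measurements z = A @ theta + w.
n = 100_000
Theta = rng.standard_normal((n, d))
Z = Theta @ A.T + sigma * rng.standard_normal((n, m))

# Mixed-resolution observations: keep the first m_cont entries,
# 1-bit quantize (take the sign of) the remaining m_1bit entries.
Y = Z.copy()
Y[:, m_cont:] = np.sign(Y[:, m_cont:])

# Sample cross-covariance C_{theta,y} and measurement covariance C_{yy}.
Yc = Y - Y.mean(axis=0)
Tc = Theta - Theta.mean(axis=0)
C_ty = Tc.T @ Yc / n
C_yy = Yc.T @ Yc / n

# LMMSE estimator: theta_hat = E[theta] + C_{theta,y} C_{yy}^{-1} (y - E[y]).
W = C_ty @ np.linalg.inv(C_yy)
Theta_hat = Theta.mean(axis=0) + Yc @ W.T

mse = np.mean(np.sum((Theta - Theta_hat) ** 2, axis=1))
print(f"empirical MSE of mixed-resolution LMMSE: {mse:.4f}")
```

Varying the split between m_cont and m_1bit in this sketch gives a rough feel for the resource-allocation trade-off that the paper optimizes analytically under a power constraint.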
