2019-06-03-denoising

Posted on June 3, 2019

Wiki

  • white noise
    • "white" refers to how signal power is distributed: uniformly across frequencies, i.e. independently over time
    • a white noise vector is one whose components each have a probability distribution with mean 0 and finite variance, and are statistically independent
  • gaussian white noise
    • white noise vector where each component assumes a gaussian distribution
      • the vector is a multivariate gaussian distribution
  • nonlocal means: https://en.wikipedia.org/wiki/Non-local_means
    • replaces each pixel with a weighted mean of all pixels, with weights based on the similarity of their local neighborhoods
  • bilateral filter: https://en.wikipedia.org/wiki/Bilateral_filter
  • total variation: https://en.wikipedia.org/wiki/Total_variation_denoising
    • denoising while preserving edges
    • 2D problem harder to solve (primal dual: https://link.springer.com/content/pdf/10.1023%2FB%3AJMIV.0000011325.36760.1e.pdf)
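As a quick illustration of the definitions above, a minimal numpy sketch that adds zero-mean Gaussian white noise to an image (sigma = 0.1 is just an illustrative value):

```python
import numpy as np

rng = np.random.default_rng(0)

def add_gaussian_white_noise(img, sigma=0.1):
    """Add i.i.d. zero-mean Gaussian noise to an image (each pixel independent)."""
    noise = rng.normal(loc=0.0, scale=sigma, size=img.shape)
    return img + noise

clean = np.zeros((64, 64))
noisy = add_gaussian_white_noise(clean, sigma=0.1)
# sample mean should be near 0 and sample std near sigma
```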

Denoising

  • 1992_nonlinear_total_variation_based_noise_removal_algorithms
    • abstract
      • denoising by minimizing total variation, with the noise constraint enforced via Lagrange multipliers
      • solved by gradient-projection
    • intro
      • total variation norm
        • L1 norms of derivatives
        • nonlinear, computationally complex (compared to L2)
      • problem formulation
        • constraint optimization as time dependent nonlinear parabolic pde
          • derive the euler-lagrange equation for the optimization
        • solved with a time-stepping algorithm
    • results
      • denoise in flat regions AND preserves edge details
    • some follow-ups
      • primal-dual method for minimizing total variation
        • https://www.uni-muenster.de/AMM/num/Vorlesungen/MathemBV_SS16/literature/Chambolle2004.pdf
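The time-stepping idea above can be sketched in a few lines of numpy. This is a toy explicit scheme on the smoothed Euler-Lagrange equation of the ROF model, not the paper's implementation; lam, step, iters, and eps are illustrative values:

```python
import numpy as np

def tv_denoise(f, lam=0.2, step=0.01, iters=200, eps=1e-2):
    """Explicit time-stepping on the smoothed ROF Euler-Lagrange equation:
       du/dt = div( grad u / sqrt(|grad u|^2 + eps) ) - lam * (u - f)."""
    u = f.copy()
    for _ in range(iters):
        # forward differences with periodic boundary (np.roll)
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux ** 2 + uy ** 2 + eps)   # smoothed gradient magnitude
        px, py = ux / mag, uy / mag              # normalized gradient field
        # backward-difference divergence (adjoint of the forward gradient)
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u = u + step * (div - lam * (u - f))
    return u
```

On a noisy step image this smooths the flat regions while the step survives, which is the "denoise AND preserve edges" behavior noted in the results.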
  • 1998_bilateral_filtering_for_gray_and_color_images
    • bilateral filtering
  • 2008_nonlocal_image_and_movie_denoising
    • nonlocal mean filtering
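A brute-force sketch of nonlocal means for a small grayscale image in [0, 1]: every pixel is replaced by a weighted mean of all pixels, with weights from patch similarity rather than spatial distance. The patch size and the bandwidth h are illustrative; real implementations restrict the search window for speed:

```python
import numpy as np

def nonlocal_means(img, patch=3, h=0.1):
    """O(N^2) nonlocal means on a small grayscale image."""
    pad = patch // 2
    p = np.pad(img, pad, mode='reflect')
    H, W = img.shape
    # stack every patch as a row vector: (H*W, patch*patch)
    patches = np.array([
        p[i:i + patch, j:j + patch].ravel()
        for i in range(H) for j in range(W)
    ])
    # mean squared patch distance between every pair of pixels
    d2 = ((patches[:, None, :] - patches[None, :, :]) ** 2).mean(-1)
    w = np.exp(-d2 / h ** 2)                     # patch-similarity weights
    out = (w @ img.ravel()) / w.sum(axis=1)      # weighted mean of all pixels
    return out.reshape(H, W)
```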
  • 2010_fast_image_recovery_using_variable_splitting_and_constraint_optimization
    • abstract
      • using ADMM for solving unconstrained problem where objective includes
        • L2 data-fidelity term
        • non-smooth regularizer
      • variable splitting
        • equivalent to constrained optimization, addressed with augmented Lagrangian method
    • problem formulation
      • synthesis approach (e.g. with wavelets)
        • x = W\beta where
          • W contains the elements of a wavelet frame
          • \beta are the parameters to be estimated
      • analysis approach
        • x sampled randomly
        • based on regularizers that analyze the image itself rather than its representation in the wavelet domain
          • e.g. total variation regularizer
      • unified view
        • min_x 1/2 || Ax - y ||_2^2 + \tau \phi(x)
          • A = BW for the synthesis approach; A = B for the analysis approach
        • solvers
          • iterative shrinkage / thresholding (IST)
            • relies on denoising function \Psi(y)
            • iteration: x_{k+1} = \Psi ( x_k - (1/\gamma) A^T(A x_k - y) )
            • problem: slow
    • proposed algo: split augmented Lagrangian shrinkage algorithm (SALSA)
      • a variant of ADMM
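For reference, the IST baseline that SALSA improves on can be sketched in numpy for the L1-regularized case, where the denoising function \Psi is soft-thresholding. This is a toy instance of IST, not the SALSA algorithm; tau and the iteration count are illustrative:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t*||.||_1: the denoising function Psi for an L1 prior."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ist(A, y, tau, iters=2000):
    """IST for min_x 0.5*||Ax - y||_2^2 + tau*||x||_1.
    gamma is set to ||A||_2^2 so the gradient step is non-expansive."""
    gamma = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - (A.T @ (A @ x - y)) / gamma, tau / gamma)
    return x
```

The "slow" problem noted above shows up as the large number of iterations needed even on tiny problems; SALSA replaces this loop with an ADMM splitting.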
  • 2014_progressive_image_denoising_through_hybrid_graph_laplacian_regularization
    • abstract
      • laplacian regularized image denoising
      • semisupervised learning
    • intro
      • variational problem
        • minimize data fidelity term and regularization term
      • priors
        • locally smooth (nearby pixels more likely to have same/similar intensity values)
        • non-local self-similarity (pixels on same structure likely to have same or similar intensity)
    • graph laplacian regularized regression
      • graph laplacian regularizer
        • R(f) = \sum_{i,j} (f(x_i) - f(x_j))^2 w_{ij}
        • w_ij is edge weight which reflects affinity between two vertices x_i and x_j
          • want to design filters that are edge-preserving (e.g. bilateral filtering)
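The regularizer R(f) above can be computed either as the double sum or via the graph Laplacian L = D - W (with D the diagonal degree matrix); summing over ordered pairs (i, j) gives R(f) = 2 f^T L f. A small sketch on a toy 1D signal, using bilateral-style weights (spatial affinity times intensity affinity) whose bandwidths are illustrative:

```python
import numpy as np

def graph_laplacian_regularizer(f, W):
    """R(f) = sum_{i,j} w_ij (f_i - f_j)^2 via L = D - W: R(f) = 2 f^T L f."""
    D = np.diag(W.sum(axis=1))   # degree matrix
    L = D - W                    # graph Laplacian
    return 2.0 * f @ L @ f

# toy signal with one jump; bilateral-style edge weights: affinity decays
# with both spatial distance and intensity difference (edge-preserving)
f = np.array([0.0, 0.1, 0.0, 1.0, 0.9])
idx = np.arange(len(f))
W = np.exp(-(idx[:, None] - idx[None, :]) ** 2 / 2.0) \
  * np.exp(-(f[:, None] - f[None, :]) ** 2 / 0.02)
np.fill_diagonal(W, 0.0)
```

Because the intensity term kills the weight across the jump, the jump contributes little to R(f), which is exactly the edge-preserving behavior the notes point at.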

Plug-and-Play (P3) Prior

Learning denoising prior for inverse problems

Review