Edge Preserving Image Denoising in Reproducing Kernel Hilbert Spaces

Reading time: 6 minutes

📝 Original Info

  • Title: Edge Preserving Image Denoising in Reproducing Kernel Hilbert Spaces
  • ArXiv ID: 1011.5962
  • Date: 2010-11-30
  • Authors: P. Bouboulis, K. Slavakis, S. Theodoridis

📝 Abstract

The goal of this paper is the development of a novel approach to the problem of noise removal, based on the theory of Reproducing Kernel Hilbert Spaces (RKHS). The problem is cast as an optimization task in an RKHS, by taking advantage of the celebrated semi-parametric Representer Theorem. Examples verify that in the presence of Gaussian noise the proposed method performs relatively well compared to wavelet-based techniques and outperforms them significantly in the presence of impulse or mixed noise. A more detailed version of this work has been published as: P. Bouboulis, K. Slavakis and S. Theodoridis, "Adaptive Kernel-based Image Denoising employing Semi-Parametric Regularization," IEEE Transactions on Image Processing, vol. 19, no. 6, pp. 1465-1479, 2010.

💡 Deep Analysis

Figure 1

📄 Full Content

The problem of noise removal from a digitized image is one of the most fundamental problems in digital image processing. So far, various techniques have been proposed to deal with it. Among the most important methodologies are, for example, the wavelet-based image denoising methods, which have dominated the research in recent years [2,3]. In this paper we propose a novel approach which (to our knowledge) has not been considered before. We employ the well-known and powerful tool of kernels.

In kernel methodology the notion of the Reproducing Kernel Hilbert Space (RKHS) plays a crucial role. An RKHS is a rich construct (roughly, a smooth space with an inner product), which has proven to be a very powerful tool for nonlinear processing [9,11]. In the denoising problem, we exploit a useful property of RKHS, the representer theorem [9]. It states that the minimizer of any optimization task in H, with a cost function of a certain type, has a finite representation in H. We recast the image denoising problem as an optimization task of this type and use the semi-parametric version of the representer theorem. The latter allows for explicit modeling of the edges in an image. In this way we can deal with the over-smoothing that is implicitly imposed by the "smooth" nature of the RKHS.

Although there has been some work exploring the use of kernels in the denoising problem, the methodology presented here is fundamentally different. In [10], the notion of kernel regression is adopted: the original image is modeled by a Taylor series approximation around a center x_i, and data-adaptive kernels are used as weighting factors to penalize distances away from x_i. In a relatively similar context, kernels have been employed by other well-known denoising methods (such as [1]). Kernels were also used in the context of RKHS in [6,5]. However, the obtained results were not satisfactory, especially around edges. It is exactly this drawback that is addressed by our method.

We start with some basic definitions regarding RKHS. Let X be a non-empty set with x_1, ..., x_N ∈ X. Consider a Hilbert space H of real-valued functions f defined on X, with a corresponding inner product ⟨•, •⟩_H. We will call H a Reproducing Kernel Hilbert Space (RKHS) if there exists a function, known as the kernel, κ : X × X → ℝ, with the following two properties:

  1. For every x ∈ X, κ(x, •) belongs to H.

  2. κ has the so-called reproducing property, i.e., f(x) = ⟨f, κ(x, •)⟩_H, for every f ∈ H and every x ∈ X.

It can be shown that the kernel κ generates the entire space H, i.e., H = span{κ(x, •) | x ∈ X} (the closure of the span). There are several kernels that are used in practice (see [9]). In this work, we focus on one of the most widely used, the Gaussian kernel,

κ_σ(x, y) = exp(−‖x − y‖² / (2σ²)),

due to some additional properties that it admits. One of the many powerful tools in kernel theory is the application of the semi-parametric representer theorem to regularized risk minimization problems (see [9]):

Theorem 2.1. Denote by Ω₁, Ω₂ : [0, ∞) → ℝ two strictly monotonically increasing functions, by X a set and by c : (X × ℝ²)^N → ℝ ∪ {+∞} an arbitrary loss function. Moreover, let {ψ_k, k = 1, ..., K} be a set of real-valued functions on X such that the N × K matrix with entries ψ_k(x_i) has rank K. Then any f̃ = f + h, with f ∈ H and h = Σ_{k=1}^{K} b_k ψ_k ∈ span{ψ_k}, minimizing the regularized risk

c((x_1, y_1, f̃(x_1)), ..., (x_N, y_N, f̃(x_N))) + Ω₁(‖f‖_H) + Ω₂(‖b‖)

admits a representation of the form

f̃(•) = Σ_{i=1}^{N} a_i κ(x_i, •) + Σ_{k=1}^{K} b_k ψ_k(•),   (1)

where b = (b_1, ..., b_K)ᵀ.
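As a quick illustration of the Gaussian kernel defined above, the following Python sketch (the kernel width σ and the sample points are arbitrary placeholders, not values from the paper) assembles the Gram matrix κ_σ(x_i, x_j) for a few points of [0, 1]² and checks that it is numerically positive semi-definite, the property underlying the RKHS construction.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=0.25):
    """Gaussian (RBF) kernel: kappa_sigma(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

# A few sample points in [0, 1]^2 (placeholder data, not from the paper).
points = np.array([[0.0, 0.0], [0.5, 0.5], [1.0, 0.25]])

# Gram matrix K[i, j] = kappa_sigma(x_i, x_j).
K = np.array([[gaussian_kernel(p, q) for q in points] for p in points])
print(K)
print(np.all(np.linalg.eigvalsh(K) >= -1e-12))  # numerically positive semi-definite
```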

Usually the regularization term Ω(f) takes the form Ω(f) = (1/2)‖f‖²_H. In the case of the RKHS produced by the Gaussian kernel one can prove that

‖f‖²_H = ∫_X Σ_{n=0}^{∞} (σ^{2n} / (n! 2^n)) (O_n f(x))² dx,   (2)

with O_{2n} = Δ^n and O_{2n+1} = ∇Δ^n, Δ being the Laplacian and ∇ the gradient operator (see [9]). Thus, we see that the regularization term "penalizes" the derivatives (of all orders) of the minimizer. This results in a very smooth solution of the regularized risk minimization problem.

Note that, according to Theorem 2.1, the model of a function has two parts: one lying in the smooth RKHS and a parametric part h, which gives rise to the second term in the expansion (1). It is exactly this term that is exploited by our method in order to explicitly model edges.
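To make this two-part model concrete, here is a minimal sketch in which the kernel centers, the coefficients a_i, b, and the single Erf edge function are all hypothetical placeholders: the kernel expansion carries the smooth content, while the Erf term contributes the sharp transition that the RKHS part alone would tend to blur.

```python
import numpy as np
from scipy.special import erf

def kappa(x, y, sigma=0.25):
    """Gaussian kernel on points of [0, 1]^2."""
    d = np.asarray(x, float) - np.asarray(y, float)
    return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

def f_tilde(x, centers, a, b):
    """Two-part model of (1): smooth kernel expansion plus one explicit edge term."""
    smooth = sum(a_i * kappa(c_i, x) for a_i, c_i in zip(a, centers))
    edge = b * erf(20.0 * (x[0] - 0.5))  # hypothetical vertical edge at x = 0.5
    return smooth + edge

centers = np.array([[0.25, 0.25], [0.5, 0.5], [0.75, 0.75]])  # placeholder kernel centers
a, b = [0.3, -0.1, 0.2], 0.8                                  # placeholder coefficients
print(f_tilde(np.array([0.6, 0.4]), centers, a, b))
```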

Let f be the original image and f̂ the noisy one (we consider them as continuous functions). Also, let f_{i,j} and f̂_{i,j} be the restrictions of f and f̂ to the N × N square region centered at pixel (i, j) of each image (N is an odd number). Our task is to find f_{i,j} from the given samples of f̂_{i,j}. For simplicity, we drop the i, j indices and consider f_{i,j} and f̂_{i,j} (which from now on will be written as f and f̂) as functions defined on [0, 1]² (and zero elsewhere). The pixel values of the digitized image are given by f(x_n, y_m) and f̂(x_n, y_m), where x_n = n/(N − 1), y_m = m/(N − 1), for n, m = 0, 1, ..., N − 1.
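A small sketch of the sampling geometry just described, assuming a hypothetical helper patch_grid and a placeholder noisy image: an N × N patch around pixel (i, j) is mapped onto [0, 1]², so its (n, m)-th sample sits at (x_n, y_m) = (n/(N − 1), m/(N − 1)).

```python
import numpy as np

def patch_grid(image, i, j, N=5):
    """Return the N x N patch centred at pixel (i, j) and its normalised
    coordinates x_n = n/(N-1), y_m = m/(N-1) on [0, 1]^2."""
    r = N // 2
    patch = image[i - r:i + r + 1, j - r:j + r + 1]  # assumes the patch fits inside the image
    coords = np.linspace(0.0, 1.0, N)                # 0, 1/(N-1), ..., 1
    xs, ys = np.meshgrid(coords, coords, indexing="ij")
    return patch, xs, ys

noisy = np.random.rand(64, 64)                       # placeholder noisy image
patch, xs, ys = patch_grid(noisy, 10, 20, N=5)
print(patch.shape, xs[:3, 0])                        # (5, 5) [0.   0.25 0.5 ]
```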

We consider a set of real-valued functions {ψ_k, k = 1, ..., K} of two variables, suitable for representing edges; i.e., bivariate polynomials (which are controlled by the coefficients h_0, h_1, h_2, h_3) and functions of the form Erf(a·x + b·y + c), for several suitable choices of a, b and c (see Figure 1), where Erf is the error function, Erf(z) = (2/√π) ∫_0^z e^{−t²} dt.
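The sketch below builds one possible edge-describing family of this kind, using scipy.special.erf; the polynomial part is taken here as the bilinear monomials 1, x, y, xy (a guess consistent with the four coefficients h_0, ..., h_3), and the orientations, offsets and steepness of the Erf ridges are illustrative choices rather than the paper's actual parameter set.

```python
import numpy as np
from scipy.special import erf

def edge_basis():
    """Return a list of functions psi_k(x, y) intended to capture edges in a patch."""
    # Bivariate polynomial part (bilinear monomials; coefficients h0..h3 multiply these).
    psis = [lambda x, y: np.ones_like(x),
            lambda x, y: x,
            lambda x, y: y,
            lambda x, y: x * y]
    # Erf ridges Erf(a*x + b*y + c): oriented step-like transitions (illustrative a, b, c).
    for a, b, c in [(20.0, 0.0, -10.0),    # roughly vertical edge near x = 0.5
                    (0.0, 20.0, -10.0),    # roughly horizontal edge near y = 0.5
                    (14.0, 14.0, -14.0)]:  # diagonal edge
        psis.append(lambda x, y, a=a, b=b, c=c: erf(a * x + b * y + c))
    return psis

xs, ys = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5), indexing="ij")
for k, psi in enumerate(edge_basis()):
    print(k, psi(xs, ys).shape)            # each psi_k evaluated on the 5 x 5 patch grid
```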

Thus we formulate the regularized risk minimization problem as follows: minimize, over f ∈ H and b_1, ..., b_K ∈ ℝ,

Σ_{n,m=0}^{N−1} L( f̂(x_n, y_m) − f(x_n, y_m) − Σ_{k=1}^{K} b_k ψ_k(x_n, y_m) ) + (λ/2)‖f‖²_H + (μ/2) Σ_{k=1}^{K} b_k²,

where L is a suitable (convex) loss, e.g. the squared error.

Taking a closer look at the term (λ/2)‖f‖²_H, according to equation (2), one sees that we actually penalize the derivatives of f in a more influential fashion than the total variation scheme, which is often used in wavelet-based denoising methods.
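To indicate how such a problem could be solved in practice, here is a minimal sketch under the assumption of a squared-error loss: by Theorem 2.1 the smooth part can be written as f = Σ_i a_i κ(x_i, •), so ‖f‖²_H = aᵀKa with K the Gram matrix, the cost becomes quadratic in (a, b), and the minimizer is obtained from a linear system. All parameter values (λ, μ, σ, the Erf ridges) are placeholders, and the paper's actual loss and optimization procedure may differ.

```python
import numpy as np
from scipy.special import erf

def gram(pts, sigma=0.25):
    """Gaussian Gram matrix K[i, j] = exp(-||p_i - p_j||^2 / (2 sigma^2))."""
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def psi_matrix(pts):
    """Edge basis evaluated at the grid points: columns 1, x, y, xy and a few Erf ridges."""
    x, y = pts[:, 0], pts[:, 1]
    cols = [np.ones_like(x), x, y, x * y,
            erf(20 * x - 10), erf(20 * y - 10), erf(14 * x + 14 * y - 14)]
    return np.stack(cols, axis=1)

def denoise_patch(noisy_patch, lam=0.1, mu=0.01, sigma=0.25):
    """Fit f~ = sum_i a_i kappa(x_i, .) + sum_k b_k psi_k by regularised least squares
    (squared-error loss assumed) and return the fitted patch."""
    N = noisy_patch.shape[0]
    c = np.linspace(0.0, 1.0, N)
    xs, ys = np.meshgrid(c, c, indexing="ij")
    pts = np.column_stack([xs.ravel(), ys.ravel()])
    yv = noisy_patch.ravel()

    K, P = gram(pts, sigma), psi_matrix(pts)
    # Normal equations of ||y - K a - P b||^2 + lam * a'K a + mu * ||b||^2.
    A = np.block([[K @ K + lam * K, K @ P],
                  [P.T @ K, P.T @ P + mu * np.eye(P.shape[1])]])
    rhs = np.concatenate([K @ yv, P.T @ yv])
    sol = np.linalg.solve(A, rhs)
    a, b = sol[:K.shape[0]], sol[K.shape[0]:]
    return (K @ a + P @ b).reshape(N, N)

# Placeholder example: a noisy vertical step edge in a 5 x 5 patch.
clean = np.tile((np.arange(5) >= 3).astype(float), (5, 1))
noisy = clean + 0.1 * np.random.randn(5, 5)
print(np.round(denoise_patch(noisy), 2))
```

In the spirit of the patch-wise formulation above, such a fit would be repeated for every pixel neighborhood (i, j) of the noisy image.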


Reference

This content is AI-processed based on open access ArXiv data.
