noise_estimation

sunkit_image.utils.noise_estimation(img, patchsize=7, decim=0, confidence=0.999999, iterations=3)

Estimates the noise level of an image.

Additive white Gaussian noise (AWGN) is a basic noise model used in information theory to mimic the effect of many random processes that occur in nature.
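For illustration, an AWGN-corrupted image can be synthesized by adding zero-mean Gaussian noise of a chosen standard deviation to a clean image; in this sketch the clean image is a placeholder zeros array and sigma is an arbitrary choice.

>>> import numpy as np
>>> rng = np.random.default_rng(42)
>>> clean = np.zeros((100, 100))  # placeholder for a noise-free image
>>> sigma = 5.0  # true noise standard deviation
>>> noisy = clean + sigma * rng.standard_normal(clean.shape)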

Parameters:
  • img (numpy.ndarray) – A single NumPy image array.

  • patchsize (int, optional) – Patch size, defaults to 7.

  • decim (int, optional) – Decimation factor, defaults to 0. Using a larger number accelerates the calculation (see the sketch after the Examples below).

  • confidence (float, optional) – Confidence interval used to determine the threshold for the weak texture. In this algorithm, this value is usually set very close to one. Defaults to 0.999999.

  • iterations (int, optional) – Number of iterations, defaults to 3.

Returns:

dict – A dictionary containing: the estimated noise levels, nlevel; the threshold used to extract weak texture patches at the last iteration, thresh; the number of extracted weak texture patches, num; and the weak texture mask, mask.

Examples

>>> import numpy as np
>>> from sunkit_image.utils import noise_estimation
>>> rng = np.random.default_rng(0)
>>> noisy_image_array = rng.standard_normal((100, 100))
>>> estimate = noise_estimation(noisy_image_array, patchsize=11, iterations=10)
>>> estimate["mask"]  
array([[1., 1., 1., ..., 1., 1., 0.],
       [1., 1., 1., ..., 1., 1., 0.],
       [1., 1., 1., ..., 1., 1., 0.],
       ...,
       [1., 1., 1., ..., 1., 1., 0.],
       [1., 1., 1., ..., 1., 1., 0.],
       [0., 0., 0., ..., 0., 0., 0.]])
>>> estimate["nlevel"]  
array([0.97398633])
>>> estimate["thresh"]  
array([164.21965135])
>>> estimate["num"]  
array([8100.])
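
As a rough sketch of how the returned dictionary might be used, the estimated nlevel should approximate the standard deviation of synthetic AWGN with a known sigma; decim=1 is used here only to suggest the accelerated mode, and the exact accuracy under decimation is not guaranteed.

>>> import numpy as np
>>> from sunkit_image.utils import noise_estimation
>>> rng = np.random.default_rng(1)
>>> sigma = 5.0
>>> noisy = sigma * rng.standard_normal((100, 100))  # pure AWGN with known sigma
>>> result = noise_estimation(noisy, patchsize=11, decim=1, iterations=10)
>>> estimated_sigma = result["nlevel"][0]  # expected to lie close to sigma
>>> weak_fraction = result["mask"].mean()  # fraction of pixels in weak-texture regions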

References

  • Xinhao Liu, Masayuki Tanaka and Masatoshi Okutomi, "Noise Level Estimation Using Weak Textured Patches of a Single Noisy Image," IEEE International Conference on Image Processing (ICIP), 2012. DOI: 10.1109/ICIP.2012.6466947

  • Xinhao Liu, Masayuki Tanaka and Masatoshi Okutomi, "Single-Image Noise Level Estimation for Blind Denoising," IEEE Transactions on Image Processing, Vol. 22, No. 12, pp. 5226-5237, December 2013. DOI: 10.1109/TIP.2013.2283400