Resource Description
Reconstruction- and example-based super-resolution
(SR) methods are promising for restoring a high-resolution
(HR) image from low-resolution (LR) image(s). Under large
magnification, reconstruction-based methods usually fail
to hallucinate visual details while example-based methods
sometimes introduce unexpected details. Given a generic
LR image, we propose a novel SR method that introduces a
multi-scale dictionary and simultaneously integrates local
and non-local priors, in order to reconstruct a photo-realistic
SR image while suppressing artifacts in the reconstruction.
The local prior suppresses artifacts by using steering kernel regression to predict the target pixel from a small local
area. The non-local prior enriches visual details by taking
a weighted average of a large neighborhood as an estimate
of the target pixel. Essentially, these two priors are complementary. Experimental results
demonstrate that the proposed method produces high-quality SR recovery both quantitatively and perceptually.
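To make the local prior above concrete, the following Python sketch implements a zeroth-order steering kernel regression estimate in the spirit of Takeda et al.'s kernel regression framework: the kernel at each pixel is elongated along local edges using the structure tensor of a small window, and the target pixel is predicted as the kernel-weighted average of that window. The function name and the parameters (window radius, bandwidth h, regularizer alpha) are illustrative assumptions, not the paper's actual settings.

import numpy as np

def steering_kernel_estimate(img, radius=2, h=2.0, alpha=1.0):
    """Predict each pixel from a steering-kernel-weighted average of a small window."""
    img = img.astype(np.float64)
    gy, gx = np.gradient(img)                        # local image gradients
    pad = radius
    p_img = np.pad(img, pad, mode="reflect")
    p_gx = np.pad(gx, pad, mode="reflect")
    p_gy = np.pad(gy, pad, mode="reflect")
    offs = np.arange(-radius, radius + 1)
    dy, dx = np.meshgrid(offs, offs, indexing="ij")  # pixel offsets inside the window
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pad, j + pad
            win = (slice(ci - radius, ci + radius + 1),
                   slice(cj - radius, cj + radius + 1))
            wx, wy = p_gx[win].ravel(), p_gy[win].ravel()
            # Structure tensor of the window; its dominant eigenvector points across the edge.
            C = np.array([[np.mean(wx * wx), np.mean(wx * wy)],
                          [np.mean(wx * wy), np.mean(wy * wy)]])
            lam, V = np.linalg.eigh(C)               # lam[0] <= lam[1]
            elong = (lam[1] + alpha) / (lam[0] + alpha)
            # Inverse covariance of the steering kernel: narrow across the edge
            # (direction V[:, 1]), wide along the edge (direction V[:, 0]).
            Sinv = (elong * np.outer(V[:, 1], V[:, 1]) +
                    np.outer(V[:, 0], V[:, 0]) / elong) / (h * h)
            q = Sinv[0, 0] * dx**2 + 2 * Sinv[0, 1] * dx * dy + Sinv[1, 1] * dy**2
            w = np.exp(-0.5 * q)                     # steering kernel weights
            out[i, j] = np.sum(w * p_img[win]) / np.sum(w)
    return out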
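The non-local prior can likewise be pictured as a non-local-means-style estimate: each pixel is replaced by a weighted average over a large search neighborhood, with weights computed from patch similarity. The sketch below is only a minimal reading of that idea; the patch radius, search radius, and decay parameter h (chosen here for intensities in [0, 1]) are assumptions for illustration, and the paper's actual non-local term may be defined differently.

import numpy as np

def nonlocal_estimate(img, patch_radius=2, search_radius=7, h=0.1):
    """Estimate each pixel as a patch-similarity-weighted average of a large neighborhood."""
    pad = patch_radius + search_radius
    p = np.pad(img.astype(np.float64), pad, mode="reflect")
    out = np.zeros(img.shape, dtype=np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pad, j + pad
            ref = p[ci - patch_radius:ci + patch_radius + 1,
                    cj - patch_radius:cj + patch_radius + 1]
            weights, values = [], []
            for di in range(-search_radius, search_radius + 1):
                for dj in range(-search_radius, search_radius + 1):
                    ni, nj = ci + di, cj + dj
                    cand = p[ni - patch_radius:ni + patch_radius + 1,
                             nj - patch_radius:nj + patch_radius + 1]
                    d2 = np.mean((ref - cand) ** 2)        # patch distance
                    weights.append(np.exp(-d2 / (h * h)))  # similarity weight
                    values.append(p[ni, nj])
            weights = np.asarray(weights)
            out[i, j] = np.dot(weights, values) / weights.sum()
    return out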