[Insight-users] Computing the gradient image in the presence of masks
aviv.hurvitz at gmail.com
Wed Jun 4 04:16:25 EDT 2008
I have an image with masked-out (invalid) pixels. The masked-out pixels
aren't just on the border; there are many masked pixels scattered all over the image.
I want to compute the gradient image of this image.
The standard way to compute the gradient is to convolve with a Gaussian
kernel and then with a derivative kernel. This causes the invalid pixels'
values to "leak out" to nearby pixels. So if, say, 20% of the pixels in the
original image are invalid, the gradient image may end up with around 80% invalid
pixels, because their values are functions of one or more nearby invalid pixels.
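To illustrate the leak, here is a minimal sketch (assuming NumPy and SciPy are available) that marks one invalid pixel with NaN and applies a Gaussian derivative; the single NaN contaminates every pixel whose kernel footprint touches it:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# One invalid pixel, marked with NaN.
image = np.random.rand(32, 32)
image[16, 16] = np.nan

# Gaussian derivative along x (smoothing + first derivative in one pass).
gx = gaussian_filter(image, sigma=2.0, order=(0, 1))

# The NaN has spread over the whole kernel support, not just one pixel.
print(np.isnan(gx).sum())
```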
I don't know of a standard solution to this problem, but I've thought of two approaches.
One idea is to program a smart blurring operator that is "mask aware". It
will take a weighted average of each neighborhood, like a Gaussian kernel
does, except that masked pixels won't be included in the average (and the
remaining weights are renormalized). Blurring with this kernel won't make
the masks leak out.
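This mask-aware blur can be written as a normalized convolution: blur the zero-filled image and the mask with the same Gaussian, then divide. A minimal sketch with NumPy/SciPy (`masked_gaussian_blur` is just an illustrative name):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def masked_gaussian_blur(image, valid_mask, sigma=1.0):
    """Gaussian blur that ignores masked-out pixels (normalized convolution)."""
    # Zero out invalid pixels so they contribute nothing to the sum.
    img = np.where(valid_mask, image, 0.0).astype(float)
    weights = valid_mask.astype(float)

    # Blur the masked image and the mask with the same kernel.
    blurred = gaussian_filter(img, sigma)
    norm = gaussian_filter(weights, sigma)

    # Divide by the summed weight of the valid pixels in each neighborhood;
    # pixels whose neighborhood contains no valid data stay at 0.
    return np.where(norm > 0, blurred / np.maximum(norm, 1e-12), 0.0)
```

A quick sanity check of the idea: blurring a constant-valued image with holes should return the same constant everywhere, since the masked pixels carry zero weight.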
The other idea is to do inpainting of the masked-out pixels, i.e. replace
the invalid pixels with valid values that are based on a smart interpolation
of their neighborhood. Then I can apply a standard gradient filter on the
inpainted image.
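A very simple form of such inpainting is diffusion-based fill: iteratively replace each invalid pixel with the average of its 4-neighbours until the hole values settle. A hedged sketch (plain NumPy, Jacobi-style iteration; `inpaint_diffusion` is an illustrative name, not a library function):

```python
import numpy as np

def inpaint_diffusion(image, valid_mask, n_iter=200):
    """Fill invalid pixels by repeated 4-neighbour averaging (diffusion)."""
    filled = np.where(valid_mask, image, 0.0).astype(float)
    invalid = ~valid_mask
    for _ in range(n_iter):
        # Average of the four axis-aligned neighbours (edges wrap via roll,
        # which is fine for interior holes).
        avg = (np.roll(filled, -1, axis=0) + np.roll(filled, 1, axis=0) +
               np.roll(filled, -1, axis=1) + np.roll(filled, 1, axis=1)) / 4.0
        # Only the invalid pixels are updated; valid data is left untouched.
        filled[invalid] = avg[invalid]
    return filled
```

After filling, an ordinary Gaussian-derivative gradient filter can run on the result, since there are no invalid values left to leak.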
I'd appreciate any ideas or implementation tips.