Edge detection is a fundamental tool used in most image-processing applications to obtain information from the frames as a precursor step to feature extraction and object segmentation. This process detects the outlines of an object and the boundaries between objects and the background in the image. An edge-detection filter can also be used to improve the appearance of blurred or anti-aliased image streams. The basic edge-detection operator is a matrix-area gradient operation that determines the level of variance between neighboring pixels. It is computed by forming a matrix centered on a chosen pixel; if the gradient value over this matrix area is above a given threshold, the center pixel is classified as an edge. Examples of gradient-based edge detectors are the Roberts, Prewitt, and Sobel operators. All gradient-based algorithms have kernel operators that calculate the strength of the slope in two mutually orthogonal directions, commonly vertical and horizontal. The contributions of the two slope components are then combined to give the total value of the edge strength. The Prewitt operator measures two such components.
The vertical edge component is calculated with kernel Kx and the horizontal edge component is calculated with kernel Ky; |Kx| + |Ky| gives an indication of the intensity of the gradient at the current pixel. Depending on the noise characteristics of the image, edge-detection results can vary. Gradient-based algorithms such as the Prewitt filter have a major drawback of being very sensitive to noise. The size of the kernel filter and its coefficients are fixed and cannot be adapted to a given image.
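As a rough illustration of the Prewitt operator described above, the following minimal NumPy/SciPy sketch applies the two 3×3 kernels and thresholds |Kx| + |Ky| to produce a binary edge map. The threshold value, function name, and synthetic test image are illustrative assumptions, not part of the original text.

```python
import numpy as np
from scipy.signal import convolve2d

# Prewitt kernels: Kx responds to vertical edges (horizontal intensity change),
# Ky responds to horizontal edges (vertical intensity change).
KX = np.array([[-1, 0, 1],
               [-1, 0, 1],
               [-1, 0, 1]], dtype=float)
KY = np.array([[-1, -1, -1],
               [ 0,  0,  0],
               [ 1,  1,  1]], dtype=float)

def prewitt_edges(img, threshold=50.0):
    """Binary edge map using |Kx| + |Ky| as the edge strength.

    `threshold` is an illustrative value; in practice it is tuned per image.
    """
    gx = convolve2d(img, KX, mode="same", boundary="symm")
    gy = convolve2d(img, KY, mode="same", boundary="symm")
    strength = np.abs(gx) + np.abs(gy)   # combined edge strength per pixel
    return strength > threshold          # pixel is an edge if above threshold

# Example: a bright square on a dark background yields edges along its border.
img = np.zeros((64, 64))
img[16:48, 16:48] = 255.0
edges = prewitt_edges(img)
```

Combining the components as |Kx| + |Ky|, as the text describes, avoids the square root of the Euclidean magnitude while still indicating gradient intensity.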
The approaches used to remove the noise in the existing system are:
There are numerous types of convolution filters: matrix filters, smoothing, high-pass, edge detection, etc. The main issue with matrix convolution is that it requires an astronomical number of computations. For example, if an 800×600 image is convolved with a 9×9 PSF (point spread function), nearly 39 million multiplications and additions are already needed (800×600×9×9 = 38,880,000); a rough cost sketch is given below. Several strategies are useful to reduce the execution time when computing matrix convolutions and edge detection:
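As a back-of-the-envelope check of that operation count, the sketch below (an illustration, not the system's implementation) spells out the four nested loops of a direct 2D convolution, which is where the 800×600×9×9 multiply-accumulate figure comes from.

```python
import numpy as np

# Rough cost of a direct (non-optimised) convolution: one multiply-add per
# kernel coefficient per output pixel.
height, width = 600, 800      # image size from the text
kh, kw = 9, 9                 # 9x9 PSF
macs = height * width * kh * kw
print(f"multiply-accumulate operations: {macs:,}")   # 38,880,000

def convolve_direct(img, kernel):
    """Direct 2D convolution; the four nested loops make the cost explicit."""
    out = np.zeros(img.shape, dtype=float)
    pad_h, pad_w = kernel.shape[0] // 2, kernel.shape[1] // 2
    padded = np.pad(img, ((pad_h, pad_h), (pad_w, pad_w)), mode="edge")
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            for i in range(kernel.shape[0]):
                for j in range(kernel.shape[1]):
                    out[y, x] += padded[y + i, x + j] * kernel[i, j]
    return out
```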