
Histogram comparison

We have already learned how to use the cv2.calcHist() function to compute histograms, and seen some examples of how to use them, in Gray-scale Histograms in Detail.

In this section, we introduce another histogram-related function provided by OpenCV, cv2.compareHist(), which calculates the degree of match between two histograms. Because a histogram reflects the distribution of pixel intensities in an image, this function can also be used to compare images. However, a histogram only records statistics, not the locations of pixels. Therefore, a common approach to image comparison is to divide the image into a number of regions (usually of equal size), compute the histogram of each region, and finally concatenate all the histograms to build a feature representation of the image. For simplicity, the examples in this section use only one region (the full image) and do not divide the image into multiple regions.
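The region-based approach described above can be sketched in a few lines of NumPy. This is only an illustration: the function name, grid size, and bin count are choices of this example, not part of OpenCV's API.

```python
import numpy as np

def block_histogram_feature(gray, grid=(4, 4), bins=32):
    # Illustrative sketch of the region-based approach described above;
    # the function name, grid size and bin count are choices of this example.
    h, w = gray.shape
    bh, bw = h // grid[0], w // grid[1]
    features = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = gray[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            hist, _ = np.histogram(block, bins=bins, range=(0, 256))
            features.append(hist / max(hist.sum(), 1))  # L1-normalize each region
    return np.concatenate(features)

feature = block_histogram_feature(np.zeros((64, 64), dtype=np.uint8))
print(feature.shape)  # 4 * 4 regions x 32 bins -> (512,)
```

Two such feature vectors could then be compared with any of the metrics introduced below.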

cv2.compareHist()

cv2.compareHist(H1, H2, method)

Here, H1 and H2 are the histograms to be compared, and method specifies the comparison metric. OpenCV provides four different methods for calculating the degree of match:

| Metric | Description |
| --- | --- |
| cv2.HISTCMP_CORREL | Computes the correlation between the two histograms; returns values in the range [-1, 1], where 1 represents a perfect match and -1 a complete mismatch |
| cv2.HISTCMP_CHISQR | Computes the chi-square distance between the two histograms; returns values in the range [0, unbounded), where 0 represents a perfect match and larger values indicate a worse match |
| cv2.HISTCMP_INTERSECT | Computes the intersection between the two histograms; if the histograms are normalized, returns values in the range [0, 1], where 1 represents a perfect match and 0 a complete mismatch |
| cv2.HISTCMP_BHATTACHARYYA | Computes the Bhattacharyya distance between the two histograms; returns values in the range [0, 1], where 0 represents a perfect match and 1 a complete mismatch |
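For reference, the correlation metric is simply the Pearson correlation coefficient of the two histograms. A minimal NumPy re-implementation (for illustration only; in practice use cv2.compareHist) makes the formula concrete:

```python
import numpy as np

def histcmp_correl(h1, h2):
    # Pearson correlation coefficient of the two histograms -- the
    # formula behind cv2.HISTCMP_CORREL (re-implemented for illustration).
    d1 = h1 - h1.mean()
    d2 = h2 - h2.mean()
    return float((d1 * d2).sum() / np.sqrt((d1 * d1).sum() * (d2 * d2).sum()))

h = np.array([4.0, 2.0, 1.0, 3.0])
print(histcmp_correl(h, h))        # a histogram compared with itself -> 1.0
print(histcmp_correl(h, h[::-1]))  # a different histogram -> lower score
```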

To compare the different metrics, we first load an image and apply several transformations to it; we then use all four metrics to compute the similarity between the original image and the transformed versions.

import cv2
import numpy as np

# Load the image
image = cv2.imread('example.png')
# Convert to grayscale
gray_image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

M = np.ones(gray_image.shape, dtype='uint8') * 30
# Add 30 to all pixel values
added_image = cv2.add(gray_image, M)
# Subtract 30 from all pixel values
subtracted_image = cv2.subtract(gray_image, M)
# Apply a blur filter
blurred_image = cv2.blur(gray_image, (10, 10))

def load_all_test_images():
    images = []
    images.append(gray_image)
    images.append(blurred_image)
    images.append(added_image)
    images.append(subtracted_image)
    return images

The four metrics are then used to compute the similarity between the original grayscale image and each test image:

test_images = load_all_test_images()
hists = []
for img in test_images:
    # Calculate the histogram
    hist = cv2.calcHist([img], [0], None, [256], [0, 256])
    # Normalize the histogram
    hist = cv2.normalize(hist, hist, norm_type=cv2.NORM_L1)
    hists.append(hist)
# Use the cv2.HISTCMP_CORREL metric
gray_gray = cv2.compareHist(hists[0], hists[0], cv2.HISTCMP_CORREL)
gray_grayblurred = cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_CORREL)
gray_addedgray = cv2.compareHist(hists[0], hists[2], cv2.HISTCMP_CORREL)
gray_subgray = cv2.compareHist(hists[0], hists[3], cv2.HISTCMP_CORREL)
# Use the cv2.HISTCMP_CHISQR metric
gray_gray = cv2.compareHist(hists[0], hists[0], cv2.HISTCMP_CHISQR)
gray_grayblurred = cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_CHISQR)
gray_addedgray = cv2.compareHist(hists[0], hists[2], cv2.HISTCMP_CHISQR)
gray_subgray = cv2.compareHist(hists[0], hists[3], cv2.HISTCMP_CHISQR)
# Use the cv2.HISTCMP_INTERSECT metric
gray_gray = cv2.compareHist(hists[0], hists[0], cv2.HISTCMP_INTERSECT)
gray_grayblurred = cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_INTERSECT)
gray_addedgray = cv2.compareHist(hists[0], hists[2], cv2.HISTCMP_INTERSECT)
gray_subgray = cv2.compareHist(hists[0], hists[3], cv2.HISTCMP_INTERSECT)
# Use the cv2.HISTCMP_BHATTACHARYYA metric
gray_gray = cv2.compareHist(hists[0], hists[0], cv2.HISTCMP_BHATTACHARYYA)
gray_grayblurred = cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_BHATTACHARYYA)
gray_addedgray = cv2.compareHist(hists[0], hists[2], cv2.HISTCMP_BHATTACHARYYA)
gray_subgray = cv2.compareHist(hists[0], hists[3], cv2.HISTCMP_BHATTACHARYYA)

The output of the program looks like this:

As the figure above shows, img 1 matches perfectly under every metric because it is compared with itself. img 2 also matches well because it is a smoothed version of the query image. img 3 and img 4 give poor matches because adding or subtracting a constant shifts the histogram along the intensity axis.
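The poor scores for the brightened and darkened versions have a simple explanation: adding a constant to every pixel translates the whole histogram, so its bins barely overlap the original ones. A small NumPy sketch with synthetic pixel values (not the example image) shows the shift:

```python
import numpy as np

rng = np.random.default_rng(0)
pixels = rng.integers(60, 196, size=10_000)  # synthetic grayscale values

hist_orig, _ = np.histogram(pixels, bins=256, range=(0, 256))
hist_shift, _ = np.histogram(pixels + 30, bins=256, range=(0, 256))

# Adding 30 moves every pixel into a bin 30 positions higher, so the
# shifted histogram is the original translated along the intensity axis.
print(np.array_equal(hist_shift[30:], hist_orig[:-30]))  # True
```

The blurred image, in contrast, keeps roughly the same intensity distribution, which is why it still scores well.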

Related links

Basic concept of OpenCV histogram

OpenCV gray histogram details

OpenCV color histogram with custom histogram visualization