The descriptors obtained in one image must also be recognized in the other image, because once matching features are found, the images can be stitched together based on those matches.
Now, we are mainly going to match the descriptors of the two images.
The Brute-Force matcher (cv2.BFMatcher()) is simple: it takes the descriptor of one feature in the first set and matches it against all features in the second set using some distance calculation, and the closest one is returned.
For the BF matcher, we first have to create the BFMatcher object using cv2.BFMatcher(). It takes two optional parameters:
The first parameter is normType: it specifies the distance measurement to be used. For binary descriptors like ORB, BRIEF, and BRISK, cv2.NORM_HAMMING should be used, which uses the Hamming distance as the measurement.
The second parameter is a boolean, crossCheck: it is False by default. If it is True, the matcher returns only those matches (i, j) such that the i-th descriptor in set A has the j-th descriptor in set B as its best match and vice versa. That is, the two features in the two sets must match each other. This provides consistent results.
The match method of the BFMatcher object: calling this method returns the best matches.
Get an object of BFMatcher() and store it in bf:
bf = cv2. << your code comes here >>(cv2.NORM_HAMMING, crossCheck=True)
Call the match method of the bf object and pass the descriptors of the 2 images as input arguments. It returns a list of matches:
matches = bf.<< your code comes here >>(des1,des2)
matches is a list of DMatch objects. Each DMatch object has the following attributes: distance (the distance between the two descriptors; lower is better), queryIdx (the index of the descriptor in the first set), trainIdx (the index of the descriptor in the second set), and imgIdx (the index of the train image).
Now, let us sort them in the order of their distances.
matches = sorted(matches, key = lambda x:x.distance)