The descriptors obtained in one image need to be recognized in the other image as well: once the matching features are found, the two images can be stitched together based on those matches.
Now, we are mainly going to do the following:
Use BFMatcher(): The Brute-Force matcher (cv2.BFMatcher()) is simple. It takes the descriptor of one feature in the first set and matches it against all the features in the second set using some distance calculation, and the closest one is returned.
For the BF matcher, we first have to create the BFMatcher object using cv2.BFMatcher(). It takes two optional params:
The first param is normType: it specifies the distance measurement to be used. For binary descriptors like ORB, BRIEF, and BRISK, cv2.NORM_HAMMING should be used, which uses Hamming distance as the measurement.
The second param is a boolean variable, crossCheck: it is False by default. If it is True, the matcher returns only those matches (i, j) such that the i-th descriptor in set A has the j-th descriptor in set B as its best match and vice versa. That is, the two features in both sets should match each other. This provides consistent results.
Use the match method of the BFMatcher object: calling this method returns the best matches, as shown in the sketch below.
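For instance, here is a minimal sketch of these two steps in isolation. The random arrays below are stand-ins just to make the snippet runnable; in this exercise, des1 and des2 come from a feature detector such as ORB:

import cv2
import numpy as np

# Stand-in binary descriptors, 32 bytes each like ORB produces;
# in practice these come from a detector's detectAndCompute().
des1 = np.random.randint(0, 256, (500, 32), dtype=np.uint8)
des2 = np.random.randint(0, 256, (500, 32), dtype=np.uint8)

# Hamming distance suits binary descriptors; crossCheck=True keeps
# only mutually-best matches.
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = bf.match(des1, des2)
print(len(matches), "matches found")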
Get an object of BFMatcher() and store it in the bf variable.
bf = cv2. << your code comes here >>(cv2.NORM_HAMMING, crossCheck=True)
Call the match method of the bf object and pass the descriptors of the two images as input arguments. It returns a list of matches.
matches = bf.<< your code comes here >>(des1,des2)
matches is a list of DMatch objects. A DMatch object has the following attributes:
DMatch.distance: the distance between the two descriptors; the lower, the better.
DMatch.queryIdx: the index of the descriptor in the query set of descriptors (des1).
DMatch.trainIdx: the index of the descriptor in the train set of descriptors (des2).
DMatch.imgIdx: the index of the train image.
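Continuing the sketch above, these attributes can be read directly off any element of the matches list:

# Examine the first match (assumes the matches list from bf.match above).
m = matches[0]
print(m.queryIdx, m.trainIdx, m.imgIdx, m.distance)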
Now, let us sort the matches in ascending order of their distances, so that the best matches (those with the lowest distance) come first.
matches = sorted(matches, key = lambda x:x.distance)
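Putting all the steps together, here is a minimal end-to-end sketch. The file names left.jpg and right.jpg and the choice of cv2.ORB_create() are assumptions for illustration; any detector that produces binary descriptors would work the same way:

import cv2

# Hypothetical input images; replace with your own pair.
img1 = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

# Detect keypoints and compute binary (ORB) descriptors for both images.
orb = cv2.ORB_create()
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force matching with Hamming distance and cross-checking.
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = bf.match(des1, des2)
matches = sorted(matches, key=lambda x: x.distance)

# Draw the 20 best matches side by side for visual inspection.
out = cv2.drawMatches(img1, kp1, img2, kp2, matches[:20], None)
cv2.imwrite("matches.jpg", out)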