SURF Feature detection/matching issues in EMGU 2.4
    <p>So in my spare time I like to try and automate various games through computer vision techniques. Normally template matching with filters and pixel detection works fine with me. However, I recently decided to try my hand at navigating through a level by using feature matching. What I intended was to save a filtered image of an entire explored map. <img src="https://i.imgur.com/jvHLlh.png" alt="FullMap"></p> <p>Then to copy the Minimap from the screen every few seconds and filter it in the same manner and use Surf to match it to my full map which should hopefully give me a players current location (center of the match would be where the player is on the map). A good example of this working as intended is below(full map with found match on left, right is mini map image. <img src="https://i.imgur.com/SnUjIh.png" alt="GoodMatch"></p> <p>What I am having trouble with is the Surf Matching in the EMGU library seems to find incorrect matches in many cases. <img src="https://i.imgur.com/xgASIh.png" alt="BadMatch"> <img src="https://i.imgur.com/LGMoqh.png" alt="BadMatch2"></p> <p>Sometimes its not completely bad like below: <img src="https://i.imgur.com/4fVwkh.png" alt="WonkyMatch"></p> <p>I can kind of see whats happening is that its finding better matches for the keypoints in different locations on the map since Surf is supposed to be scale invariant. I don't know enough about the EMGU library or Surf to limit it so that it only accepts matches like the initial good one and either throws away these bad matches, or to tune it so those wonky matches are good ones instead.</p> <p>I am using the new 2.4 EMGU code base and my code for the SURF matching is below. I would really like to get it to the point so that it only returns matches that are always the same size(scaled ratio of normal minimap size to what it would be on the full map) so that I don't get some crazy shaped matches.</p> <pre><code>public Point MinimapMatch(Bitmap Minimap, Bitmap FullMap) { Image&lt;Gray, Byte&gt; modelImage = new Image&lt;Gray, byte&gt;(Minimap); Image&lt;Gray, Byte&gt; observedImage = new Image&lt;Gray, byte&gt;(FullMap); HomographyMatrix homography = null; SURFDetector surfCPU = new SURFDetector(100, false); VectorOfKeyPoint modelKeyPoints; VectorOfKeyPoint observedKeyPoints; Matrix&lt;int&gt; indices; Matrix&lt;byte&gt; mask; int k = 6; double uniquenessThreshold = 0.9; try { //extract features from the object image modelKeyPoints = surfCPU.DetectKeyPointsRaw(modelImage, null); Matrix&lt;float&gt; modelDescriptors = surfCPU.ComputeDescriptorsRaw(modelImage, null, modelKeyPoints); // extract features from the observed image observedKeyPoints = surfCPU.DetectKeyPointsRaw(observedImage, null); Matrix&lt;float&gt; observedDescriptors = surfCPU.ComputeDescriptorsRaw(observedImage, null, observedKeyPoints); BruteForceMatcher&lt;float&gt; matcher = new BruteForceMatcher&lt;float&gt;(DistanceType.L2); matcher.Add(modelDescriptors); indices = new Matrix&lt;int&gt;(observedDescriptors.Rows, k); using (Matrix&lt;float&gt; dist = new Matrix&lt;float&gt;(observedDescriptors.Rows, k)) { matcher.KnnMatch(observedDescriptors, indices, dist, k, null); mask = new Matrix&lt;byte&gt;(dist.Rows, 1); mask.SetValue(255); Features2DToolbox.VoteForUniqueness(dist, uniquenessThreshold, mask); } int nonZeroCount = CvInvoke.cvCountNonZero(mask); if (nonZeroCount &gt;= 4) { nonZeroCount = Features2DToolbox.VoteForSizeAndOrientation(modelKeyPoints, observedKeyPoints, indices, mask, 1.5, 20); if (nonZeroCount &gt;= 4) homography = 
Features2DToolbox.GetHomographyMatrixFromMatchedFeatures(modelKeyPoints, observedKeyPoints, indices, mask, 2); } if (homography != null) { //draw a rectangle along the projected model Rectangle rect = modelImage.ROI; PointF[] pts = new PointF[] { new PointF(rect.Left, rect.Bottom), new PointF(rect.Right, rect.Bottom), new PointF(rect.Right, rect.Top), new PointF(rect.Left, rect.Top)}; homography.ProjectPoints(pts); Array.ConvertAll&lt;PointF, Point&gt;(pts, Point.Round); Image&lt;Bgr, Byte&gt; result = Features2DToolbox.DrawMatches(modelImage, modelKeyPoints, observedImage, observedKeyPoints, indices, new Bgr(255, 255, 255), new Bgr(255, 255, 255), mask, Features2DToolbox.KeypointDrawType.DEFAULT); result.DrawPolyline(Array.ConvertAll&lt;PointF, Point&gt;(pts, Point.Round), true, new Bgr(Color.Red), 5); return new Point(Convert.ToInt32((pts[0].X + pts[1].X) / 2), Convert.ToInt32((pts[0].Y + pts[3].Y) / 2)); } } catch (Exception e) { return new Point(0, 0); } return new Point(0,0); } </code></pre>
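
For what it's worth, here is the kind of filtering I have been considering. As far as I can tell, with `k = 2` the `VoteForUniqueness` call amounts to Lowe's ratio test, and a threshold around 0.75 would be much stricter than my current 0.9 with `k = 6`. A hand-rolled sketch of the same idea (untested; it assumes `dist` and `mask` come from the `KnnMatch` call above, run with `k = 2`):

```csharp
// Sketch only: manual Lowe-style ratio test, assuming KnnMatch was run
// with k = 2 so each row of dist holds the two smallest distances in
// ascending order. A row survives only when the best match is clearly
// better than the runner-up; ambiguous matches are masked out.
for (int i = 0; i < dist.Rows; i++)
{
    if (dist[i, 0] >= 0.75f * dist[i, 1])
        mask[i, 0] = 0;
}
```

That alone probably wouldn't stop geometrically wrong matches, though, which is why I also want some kind of scale check on the homography.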
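
And this is roughly the scale check I have in mind, as a helper that would sit next to `MinimapMatch` (again just a sketch; `MatchHasExpectedScale`, `expectedScale`, and `tolerance` are my own made-up names, and the values would need tuning, e.g. 1.0 and 0.2 if the minimap and full map are captured at the same zoom):

```csharp
// Sketch only: reject a homography whose projected quad is not roughly
// a rectangle at the expected minimap-to-map scale. pts is the corner
// array after homography.ProjectPoints(pts), in the same order as in
// MinimapMatch: bottom-left, bottom-right, top-right, top-left.
private static bool MatchHasExpectedScale(PointF[] pts, Rectangle modelRect,
                                          double expectedScale, double tolerance)
{
    double width = Distance(pts[0], pts[1]);   // projected bottom edge
    double height = Distance(pts[0], pts[3]);  // projected left edge
    double scaleX = width / modelRect.Width;
    double scaleY = height / modelRect.Height;

    // Both axes must sit near the expected scale and near each other;
    // the last test also throws out heavily sheared "wonky" quads.
    return Math.Abs(scaleX - expectedScale) < tolerance
        && Math.Abs(scaleY - expectedScale) < tolerance
        && Math.Abs(scaleX - scaleY) < tolerance;
}

private static double Distance(PointF a, PointF b)
{
    double dx = a.X - b.X;
    double dy = a.Y - b.Y;
    return Math.Sqrt(dx * dx + dy * dy);
}
```

Is a manual check like this the usual approach, or is there something built into EMGU that constrains the match geometry for me?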