<p>Ok, as promised here is a simple solution:</p> <p>Notation: Let <code>d_{i,j}=d_{j,i}</code> denote the <strong>squared</strong> distance between points <code>i</code> and <code>j</code>. Let <code>N</code> be the number of points. Let <code>p_i</code> denote point <code>i</code> and <code>p_{i,k}</code> the <code>k</code>-th coordinate of that point.</p> <p>Let's hope I derive the algorithm correctly now. There will be some explanations afterwards so you can check the derivation (I hate it when many indexes appear).</p> <p>The algorithm uses dynamic programming (each point's coordinates are computed from coordinates fixed earlier) to arrive at the correct solution:</p> <pre><code>Initialization:
  p_{i,k} := 0 for all i and k

Calculation:
for i = 2 to N do
  sum = 0
  for j = 2 to i - 1 do
    accu = d_{i,1} - d_{i,j}
    for k = 1 to j - 1 do
      accu = accu + p_{j,k}^2 - 2 p_{i,k} * p_{j,k}
    done
    p_{i,j-1} = accu / ( 2 p_{j,j-1} )
    sum = sum + p_{i,j-1}^2
  done
  p_{i,i-1} = sqrt( d_{i,1} - sum )
done
</code></pre> <p>If I have not made any grave index mistakes (I usually do) this should do the trick.</p> <p>The idea behind this:</p> <p>We set the first point arbitrarily at the origin to make our life easier. Note that for a point <code>p_i</code> we never set a coordinate <code>k</code> with <code>k &gt; i-1</code>. I.e. for the second point we only set the first coordinate, for the third point we only set the first and second coordinates, etc.</p> <p>Now let's assume we have coordinates for all points <code>p_{k'}</code> where <code>k' &lt; i</code> and we want to calculate the coordinates for <code>p_i</code> so that all <code>d_{i,k'}</code> are satisfied (we do not care yet about any constraints involving points that come after <code>p_i</code>). 
If we write down the set of equations we have</p> <p><code>d_{i,j} = \sum_{k=1}^{N} (p_{i,k} - p_{j,k})^2</code></p> <p>Because both <code>p_{i,k}</code> and <code>p_{j,k}</code> are equal to zero for <code>k &gt; i-1</code> we can reduce this to:</p> <p><code>d_{i,j} = \sum_{k=1}^{i-1} (p_{i,k} - p_{j,k})^2</code></p> <p>Also note that by the loop invariant all <code>p_{j,k}</code> will be zero when <code>k &gt; j-1</code>. So we can split this sum:</p> <p><code>d_{i,j} = \sum_{k=1}^{j-1} (p_{i,k} - p_{j,k})^2 + \sum_{k=j}^{i-1} p_{i,k}^2</code></p> <p>For the first equation (<code>j = 1</code>, where all <code>p_{1,k}</code> are zero) we just get:</p> <p><code>d_{i,1} = \sum_{k=1}^{i-1} p_{i,k}^2</code></p> <p>This one will need some special treatment later.</p> <p>Now if we expand all the binomials in the general equation we get:</p> <p><code>d_{i,j} = \sum_{k=1}^{j-1} (p_{i,k}^2 - 2 p_{i,k} p_{j,k} + p_{j,k}^2) + \sum_{k=j}^{i-1} p_{i,k}^2</code></p> <p>Subtract the first equation from this and you are left with:</p> <p><code>d_{i,j} - d_{i,1} = \sum_{k=1}^{j-1} (p_{j,k}^2 - 2 p_{i,k} p_{j,k})</code></p> <p>for all <code>j &gt; 1</code>.</p> <p>If you look at this you'll note that all squares of coordinates of <code>p_i</code> are gone and the only squares we need are already known. What remains is a set of linear equations that can easily be solved using standard methods from linear algebra. Actually there is one more special thing about this set of equations: they are already in triangular form, so you only need the final step of forward substitution. For the very last coordinate we are left with one single quadratic equation that we solve by taking one square root.</p> <p>I hope you can follow my reasoning. It's a bit late and my head is spinning a bit from all those indexes.</p> <p><strong>EDIT</strong>: Yes, there were indexing mistakes. Fixed. I'll try to implement this in Python when I have time in order to test it.</p>
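<p>As a quick sanity check, a direct Python translation of the pseudocode might look like the sketch below (0-indexed instead of 1-indexed; the function name <code>reconstruct</code> and the nested-list input format are just illustrative choices, not part of the algorithm itself):</p> <pre><code>import math

def reconstruct(d):
    """Recover one set of coordinates from an N x N matrix of *squared*
    pairwise distances (nested lists, 0-indexed).  Point 0 is placed at
    the origin; point i receives at most i nonzero coordinates."""
    n = len(d)
    p = [[0.0] * (n - 1) for _ in range(n)]
    for i in range(1, n):
        total = 0.0
        for j in range(1, i):
            # solve the (already triangular) linear equation
            # for coordinate j-1 of point i
            accu = d[i][0] - d[i][j]
            for k in range(j):
                accu += p[j][k] ** 2 - 2.0 * p[i][k] * p[j][k]
            # p[j][j-1] == 0 would mean the input points are degenerate
            # (affinely dependent); no special handling is attempted here
            p[i][j - 1] = accu / (2.0 * p[j][j - 1])
            total += p[i][j - 1] ** 2
        # the last coordinate comes from the distance to the origin
        p[i][i - 1] = math.sqrt(d[i][0] - total)
    return p

# a 3-4-5 right triangle with squared distances 9, 16, 25
pts = reconstruct([[0, 9, 16],
                   [9, 0, 25],
                   [16, 25, 0]])
</code></pre> <p>With exact input this recovers the triangle as <code>(0,0)</code>, <code>(3,0)</code>, <code>(0,4)</code>; in general the result is unique only up to rotation and reflection, since pairwise distances cannot pin down an orientation.</p>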