
If you don't mind a little shell scripting, and if you can get a dump of the input in the form you've shown here... and here's my sample input:

```
100: London Manchester 366
121: London CityA 240
144: Manchester CityA 300
150: CityA CityB 90
233: Manchester London 366
```

You might be able to do something like this:

```
$ cat m.txt | awk '{ if ($2 < $3) print $2, $3, $1; else print $3, $2, $1}' | sort
CityA CityB 150:
CityA London 121:
CityA Manchester 144:
London Manchester 100:
London Manchester 233:
```

So that you at least have the pairs grouped together. Not sure what would be the best move from there.

---

Okay, here's a beast of a command line:

```
$ cat m.txt | awk '{ if ($2 < $3) print $2, $3, $1; else print $3, $2, $1}' | (sort; echo "") | awk '{ if (fst == $1 && snd == $2) { printf "%s%s", num, $3 } else { print fst, snd; fst = $1; snd = $2; num = $3} }' | grep "^[0-9]"
150:151:150:255:CityA CityB
100:233:London Manchester
```

where m.txt has these new contents:

```
100: London Manchester 366
121: London CityA 240
144: Manchester CityA 300
150: CityA CityB 90
151: CityB CityA 90
233: Manchester London 366
255: CityA CityB 90
```

Perl probably would have been a better choice than awk, but here goes. First, we sort the two city names within each line and move the ID to the end of the string; that's the first awk stage. Then we sort those lines so that matching pairs are grouped together, tacking on an extra blank line so the second awk script can finish its last group. That script loops over each line: if it sees a new pair of cities, it prints the pair it previously saw and stores the new cities and the new ID; if it sees the same pair as the previous line, it prints the previous line's ID followed by this line's ID. Finally, we grep only lines beginning with a number, which discards the non-duplicated pairs.

If a pair occurs more than twice, you'll get a duplicate ID in the output, but that's not such a big deal.

Clear as mud?
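For what it's worth, the whole pipeline can be collapsed into a single awk pass that keys an associative array on the alphabetically ordered pair. This is a sketch of my own variant (not the pipeline above), reusing the second `m.txt` sample, and it also sidesteps the duplicate-ID quirk:

```shell
# Build the sample input from the answer above.
cat > m.txt <<'EOF'
100: London Manchester 366
121: London CityA 240
144: Manchester CityA 300
150: CityA CityB 90
151: CityB CityA 90
233: Manchester London 366
255: CityA CityB 90
EOF

# Key each line on the order-insensitive city pair, accumulate its IDs,
# and in the END block print only pairs that occurred more than once.
awk '{
  pair = ($2 < $3) ? $2 " " $3 : $3 " " $2   # canonical key, e.g. "CityA CityB"
  ids[pair] = ids[pair] $1                   # IDs already end in ":", so they chain
  count[pair]++
}
END {
  for (p in count)
    if (count[p] > 1) print ids[p] p
}' m.txt | sort                              # sort: array iteration order is arbitrary
```

With this input it prints `100:233:London Manchester` and `150:151:255:CityA CityB`; because the IDs accumulate per key rather than per adjacent line, each ID appears exactly once even when a pair occurs three times.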
 
