The logarithm is the inverse operation of exponentiation. An example of exponentiation is when you double the number of items at each step. Correspondingly, a logarithmic algorithm often halves the number of items at each step. Binary search falls into this category.

Many algorithms require a logarithmic number of big steps, but each big step requires O(n) units of work. Mergesort falls into this category.

Usually you can identify these kinds of problems by visualizing them as a balanced binary tree. For example, here's merge sort:

```
6 2  0 4  1 3  7 5
2 6  0 4  1 3  5 7
0 2 4 6  1 3 5 7
0 1 2 3 4 5 6 7
```

At the top is the input, as the leaves of the tree. The algorithm creates each new row by merging the two sorted nodes above it. The height of a balanced binary tree is O(log n), so there are O(log n) big steps. However, creating each new row takes O(n) work. O(log n) big steps of O(n) work each means that mergesort is O(n log n) overall.

Generally, O(log n) algorithms look like the function below. They get to discard half of the data at each step.

```python
def function(data, n):
    if n <= constant:
        return do_simple_case(data, n)
    if some_condition():
        return function(data[:n // 2], n // 2)        # Recurse on first half of data
    else:
        return function(data[n // 2:], n - n // 2)    # Recurse on second half of data
```

O(n log n) algorithms, on the other hand, look like the function below. They also split the data in half, but they need to consider both halves.

```python
def function(data, n):
    if n <= constant:
        return do_simple_case(data, n)
    part1 = function(data[:n // 2], n // 2)           # Recurse on first half of data
    part2 = function(data[n // 2:], n - n // 2)       # Recurse on second half of data
    return combine(part1, part2)
```

Here, do_simple_case() takes O(1) time and combine() takes no more than O(n) time.

The algorithms don't need to split the data exactly in half. Splitting it into one-third and two-thirds would be fine too. For average-case performance, splitting it in half on average is sufficient (as in quicksort). As long as the recursion is done on pieces of size (n/something) and (n - n/something), the tree stays logarithmically shallow. If instead the split is a constant k and (n - k), the height of the tree will be O(n), not O(log n).
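To make the O(log n) pattern concrete, here is a minimal binary search sketch in the same shape as the first template above (the function name `binary_search` and the test values are just for illustration):

```python
def binary_search(data, target):
    """Search a sorted list, discarding half of the remaining range each step."""
    lo, hi = 0, len(data)
    while lo < hi:
        mid = (lo + hi) // 2
        if data[mid] == target:
            return mid          # Found the target
        elif data[mid] < target:
            lo = mid + 1        # Discard the lower half
        else:
            hi = mid            # Discard the upper half
    return -1                   # Target not present

# Example: 8 elements -> at most about log2(8) = 3 halving steps
print(binary_search([0, 1, 2, 3, 4, 5, 6, 7], 5))  # -> 5
```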
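And here is a concrete merge sort sketch that follows the second template, where the merge of the two sorted halves plays the role of combine() and costs O(n). This is a minimal illustration, not a tuned implementation; it uses the same input as the tree diagram above.

```python
def merge_sort(data):
    """O(n log n): recurse on both halves, then merge them in O(n)."""
    if len(data) <= 1:
        return data                     # Simple case: already sorted
    mid = len(data) // 2
    part1 = merge_sort(data[:mid])      # Recurse on first half of data
    part2 = merge_sort(data[mid:])      # Recurse on second half of data
    # combine(): merge two sorted lists in O(n)
    merged, i, j = [], 0, 0
    while i < len(part1) and j < len(part2):
        if part1[i] <= part2[j]:
            merged.append(part1[i]); i += 1
        else:
            merged.append(part2[j]); j += 1
    return merged + part1[i:] + part2[j:]

print(merge_sort([6, 2, 0, 4, 1, 3, 7, 5]))  # -> [0, 1, 2, 3, 4, 5, 6, 7]
```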