
The basic form of the query (for a person, for example) is:

```
[{
  "type": "/people/person",
  "name": None,
  "/common/topic/alias": [],
  "limit": 100
}]
```

There's documentation available at http://wiki.freebase.com/wiki/MQL_Manual

Using freebase.mqlreaditer() from the Python library http://code.google.com/p/freebase-python/ is the easiest way to cycle through all of these. In this case, the "limit" clause determines the chunk size used for querying, but you'll get each result individually at the API level.

BTW, how do you plan to disambiguate Jack Kennedy the president from the hurler, the football player, the book, and so on? See http://www.freebase.com/search?limit=30&start=0&query=jack+kennedy. You may want to capture additional information from Freebase (birth and death dates, book authors, other assigned types, etc.) if you'll have enough context to use it for disambiguation; a rough sketch of such a query is at the end of this answer.

Past a certain point, it may be easier and/or more efficient to work from the bulk data dumps rather than the API: http://wiki.freebase.com/wiki/Data_dumps

Edit: here's a working Python program which assumes you've got a list of type IDs in a file called 'types.txt':

```python
import freebase

f = file('types.txt')
for t in f:
    t = t.strip()
    # one query per type; mqlreaditer pages through the results in
    # chunks of 500 and yields them one at a time
    q = [{'type': t,
          'mid': None,
          'name': None,
          '/common/topic/alias': [],
          'limit': 500,
          }]
    for r in freebase.mqlreaditer(q):
        print '\t'.join([t, r['mid'], r['name']] + r['/common/topic/alias'])
f.close()
```

If you make the query much more complex, you'll probably want to lower the limit to keep from running into timeouts, but for a simple query like this, boosting the limit above the default of 100 will make it more efficient by querying in bigger chunks.
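As a follow-up to the disambiguation point, here's a minimal, untested sketch of what an extended query might look like. The property paths /people/person/date_of_birth and /people/deceased_person/date_of_death, and the "b:type" prefix trick for also returning all of a topic's types, are my assumptions here; check them against the MQL manual and the schema before relying on them.

```python
import freebase

# Sketch only: same pattern as the program above, but pulling a few extra
# properties that can help tell namesakes apart. Empty-list values make the
# properties optional, so topics missing them still match. Property IDs and
# the "b:type" prefixed form are assumptions -- verify against the schema.
q = [{'type': '/people/person',
      'mid': None,
      'name': None,
      '/common/topic/alias': [],
      '/people/person/date_of_birth': [],
      '/people/deceased_person/date_of_death': [],
      'b:type': [],   # all types on the topic, not just /people/person
      'limit': 100,
      }]

for r in freebase.mqlreaditer(q):
    print '\t'.join([r['mid'],
                     r['name'] or '',
                     ','.join(r['/people/person/date_of_birth']),
                     ','.join(r['/people/deceased_person/date_of_death']),
                     ','.join(r['b:type'])])
```

Once you have the extra columns (dates, other types) alongside each mid, you can match them against whatever context surrounds the name in your own data to pick the right topic.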
 
