The problem with your solution as it stands is that you're going to run out of stack space if there are too many errors on a single URL (> 1000 by default) because of the recursion. The extra stack frames also make tracebacks hard to read (500 calls to `checkUrl`). I'd rewrite it to be iterative, like so:

```python
import time
import httplib
from urlparse import urlparse

def checkUrl(url):  # Only downloads headers, returns status code.
    while True:
        try:
            p = urlparse(url)
            conn = httplib.HTTPConnection(p.netloc)
            conn.request('HEAD', p.path)
            resp = conn.getresponse()
            return resp.status
        except IOError as e:
            if e.errno == 101:
                print "Network Error"
                time.sleep(1)
        except:
            raise
```

Also, you want the last clause in your `try` to be a bare `except`, not an `else`. Your `else` only gets executed if control falls off the end of the `try` suite, which can never happen here, since the last statement of the `try` suite is a `return`.

This is very easy to change to allow a limited number of retries: just change the `while True:` line to `for _ in xrange(5):`, or however many retries you wish to accept. The function will then return `None` if it can't connect to the site after 5 attempts. You can have it return something else or raise an exception by adding a `return` or `raise SomeException` at the very end of the function (indented the same as the `for` or `while` line).
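For concreteness, here is a minimal sketch of that limited-retry variant, assuming Python 2 (`httplib`, `urlparse`, `xrange`). The `retries` parameter and the trailing `return None` are illustrative additions, and the error handling is restructured slightly so that only errno 101 triggers a retry while other `IOError`s propagate:

```python
import time
import httplib
from urlparse import urlparse

def checkUrl(url, retries=5):  # Only downloads headers.
    # Try up to `retries` times; return the status code on success,
    # or None if every attempt hit a transient network error.
    for _ in xrange(retries):
        try:
            p = urlparse(url)
            conn = httplib.HTTPConnection(p.netloc)
            conn.request('HEAD', p.path)
            return conn.getresponse().status
        except IOError as e:
            if e.errno != 101:   # only retry on "network unreachable"
                raise
            print "Network Error"
            time.sleep(1)
    return None  # all retries exhausted
```

A caller such as `checkUrl('http://example.com/')` then gets back either an integer status code or `None`, which it can test before deciding whether to record the URL as broken.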
 
