<pre><code>#!/usr/bin/env python
"""Start process; wait 2 seconds; kill the process; print all process output."""
import subprocess
import tempfile
import time

def main():
    # open temporary file (it is automatically deleted when it is closed)
    # `Popen` requires `f.fileno()` so `SpooledTemporaryFile` adds nothing here
    f = tempfile.TemporaryFile()
    # start process, redirect stdout
    p = subprocess.Popen(["top"], stdout=f)
    # wait 2 seconds
    time.sleep(2)
    # kill process
    # NOTE: if it doesn't kill the process then `p.wait()` blocks forever
    p.terminate()
    p.wait()  # wait for the process to terminate otherwise the output is garbled
    # print saved output
    f.seek(0)  # rewind to the beginning of the file
    print f.read(),
    f.close()

if __name__ == "__main__":
    main()
</code></pre>

<h1>Tail-like solutions that print only a portion of the output</h1>

<p>You could read the process output in another thread and save the required number of last lines in a queue:</p>

<pre><code>import collections
import subprocess
import time
import threading

def read_output(process, append):
    for line in iter(process.stdout.readline, ""):
        append(line)

def main():
    # start process, redirect stdout
    process = subprocess.Popen(["top"], stdout=subprocess.PIPE, close_fds=True)
    try:
        # save last `number_of_lines` lines of the process output
        number_of_lines = 200
        q = collections.deque(maxlen=number_of_lines)  # atomic .append()
        t = threading.Thread(target=read_output, args=(process, q.append))
        t.daemon = True
        t.start()
        # wait 2 seconds
        time.sleep(2)
    finally:
        process.terminate()  # NOTE: it doesn't ensure the process termination

    # print saved lines
    print ''.join(q)

if __name__ == "__main__":
    main()
</code></pre>

<p>This variant requires <code>q.append()</code> to be an atomic operation. Otherwise the output might be corrupted.</p>

<h2><a href="http://docs.python.org/library/signal.html#signal.alarm" rel="noreferrer"><code>signal.alarm()</code></a> solution</h2>

<p>You could use <a href="http://docs.python.org/library/signal.html#signal.alarm" rel="noreferrer"><code>signal.alarm()</code></a> to call <code>process.terminate()</code> after the specified timeout instead of reading in another thread, though it might not interact very well with the <code>subprocess</code> module. Based on <a href="https://stackoverflow.com/questions/1191374/subprocess-with-timeout/1191537#1191537">@Alex Martelli's answer</a>:</p>

<pre><code>import collections
import signal
import subprocess

class Alarm(Exception):
    pass

def alarm_handler(signum, frame):
    raise Alarm

def main():
    # start process, redirect stdout
    process = subprocess.Popen(["top"], stdout=subprocess.PIPE, close_fds=True)

    # set signal handler
    signal.signal(signal.SIGALRM, alarm_handler)
    signal.alarm(2)  # produce SIGALRM in 2 seconds
    try:
        # save last `number_of_lines` lines of the process output
        number_of_lines = 200
        q = collections.deque(maxlen=number_of_lines)
        for line in iter(process.stdout.readline, ""):
            q.append(line)
        signal.alarm(0)  # cancel alarm
    except Alarm:
        process.terminate()
    finally:
        # print saved lines
        print ''.join(q)

if __name__ == "__main__":
    main()
</code></pre>

<p>This approach works only on *nix systems. It might block if <code>process.stdout.readline()</code> doesn't return.</p>

<h2><a href="http://docs.python.org/library/threading.html#threading.Timer" rel="noreferrer"><code>threading.Timer</code></a> solution</h2>

<pre><code>import collections
import subprocess
import threading

def main():
    # start process, redirect stdout
    process = subprocess.Popen(["top"], stdout=subprocess.PIPE, close_fds=True)

    # terminate process in timeout seconds
    timeout = 2  # seconds
    timer = threading.Timer(timeout, process.terminate)
    timer.start()

    # save last `number_of_lines` lines of the process output
    number_of_lines = 200
    q = collections.deque(process.stdout, maxlen=number_of_lines)
    timer.cancel()

    # print saved lines
    print ''.join(q),

if __name__ == "__main__":
    main()
</code></pre>

<p>This approach should also work on Windows. Here I've used <code>process.stdout</code> as an iterable; it might introduce additional output buffering, so you could switch to the <code>iter(process.stdout.readline, "")</code> approach if that is not desirable. If the process doesn't terminate on <code>process.terminate()</code> then the script hangs.</p>

<h2>No threads, no signals solution</h2>

<pre><code>import collections
import subprocess
import sys
import time

def main():
    args = sys.argv[1:]
    if not args:
        args = ['top']
    # start process, redirect stdout
    process = subprocess.Popen(args, stdout=subprocess.PIPE, close_fds=True)

    # save last `number_of_lines` lines of the process output
    number_of_lines = 200
    q = collections.deque(maxlen=number_of_lines)

    timeout = 2  # seconds
    now = start = time.time()
    while (now - start) &lt; timeout:
        line = process.stdout.readline()
        if not line:
            break
        q.append(line)
        now = time.time()
    else:  # on timeout
        process.terminate()

    # print saved lines
    print ''.join(q),

if __name__ == "__main__":
    main()
</code></pre>

<p>This variant uses neither threads nor signals, but it produces garbled output in the terminal. It will block if <code>process.stdout.readline()</code> blocks.</p>
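<p>For completeness: the solutions above target Python 2, but on Python 3.3+ the stdlib handles the timeout itself. <code>Popen.communicate()</code> accepts a <code>timeout</code> argument and raises <code>subprocess.TimeoutExpired</code> when it elapses. A minimal Python 3 sketch (the <code>run_with_timeout()</code> helper name is mine, not a stdlib API):</p>

```python
# Python 3.3+ sketch: let communicate() enforce the timeout itself.
import subprocess

def run_with_timeout(args, timeout):
    """Run `args`, kill it after `timeout` seconds, return captured stdout."""
    process = subprocess.Popen(args, stdout=subprocess.PIPE)
    try:
        output, _ = process.communicate(timeout=timeout)
    except subprocess.TimeoutExpired:
        process.kill()  # kill() is more reliable here; terminate() can be ignored
        output, _ = process.communicate()  # collect whatever was produced so far
    return output
```

<p>Unlike the variants above this captures the whole output rather than the last lines, so combine it with <code>collections.deque(..., maxlen=number_of_lines)</code> if a tail is needed.</p>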