Tail file into message queue
I launch a process on a Linux machine via Python's `subprocess` (specifically on AWS EC2) which generates a number of files. I need to `tail -f` these files and send each of the resulting jsonified outputs to their respective AWS SQS queues. How would I go about such a task?

**Edit**

As suggested by this answer, [asyncproc](https://stackoverflow.com/questions/636561/how-can-i-run-an-external-command-asynchronously-from-python/636719#636719), and [PEP 3145](http://www.python.org/dev/peps/pep-3145/), I can do this with the following:

```python
from asyncproc import Process  # third-party module from the linked answer
import queue
import os
import time

# Substitute AWS SQS for queue.Queue in production
sta_queue = queue.Queue()
msg_queue = queue.Queue()

running_procs = {
    'status': (Process(['/usr/bin/tail', '--retry', '-f', 'test.sta']), sta_queue),
    'message': (Process(['/usr/bin/tail', '--retry', '-f', 'test.msg']), msg_queue),
}

def handle_proc(p, q):
    latest = p.read()
    if latest:  # if nothing new, latest will be an empty string
        q.put(latest)
    return p.wait(flags=os.WNOHANG)

while running_procs:
    # Snapshot the keys so entries can be deleted while iterating
    for proc_name in list(running_procs):
        proc, q = running_procs[proc_name]
        retcode = handle_proc(proc, q)
        if retcode is not None:  # process finished
            del running_procs[proc_name]
    time.sleep(1.0)

print("Status queue")
while not sta_queue.empty():
    print(sta_queue.get())
print("Message queue")
while not msg_queue.empty():
    print(msg_queue.get())
```

This should be sufficient, I think, unless others can provide a better answer.

**More Edits**

I'm overthinking the problem. Although the above works nicely, I think the simplest solution is:

- check for the existence of the files
- if the files exist, copy them to a bucket on AWS S3 and send a message through AWS SQS that the files have been copied; repeat every 60 seconds
- the consumer app polls SQS and eventually receives the message that the files have been copied
- the consumer app downloads the files from S3, replacing the previous contents with the latest contents; repeat until the job completes

The whole issue of asynchronous I/O in `subprocess` remains open, though.
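The producer/consumer steps above can be sketched with boto3-style clients. This is a minimal sketch, not the asker's actual code: the bucket name, queue URL, and file paths are placeholders, and the S3/SQS clients are taken as parameters so that real ones (`boto3.client("s3")`, `boto3.client("sqs")`) can be passed in by the caller.

```python
import json
import os

def publish_outputs(s3, sqs, bucket, queue_url, paths):
    """Producer side: upload any files in `paths` that exist to S3, then
    send one SQS message listing the copied keys. Returns the uploaded keys.
    Intended to be called in a loop with a 60-second sleep, as described above."""
    uploaded = []
    for path in paths:
        if os.path.exists(path):
            key = os.path.basename(path)
            s3.upload_file(path, bucket, key)
            uploaded.append(key)
    if uploaded:
        sqs.send_message(QueueUrl=queue_url,
                         MessageBody=json.dumps({"copied": uploaded}))
    return uploaded

def consume_once(s3, sqs, bucket, queue_url, dest_dir):
    """Consumer side: poll SQS once; on a 'copied' message, download each
    listed key from S3 (overwriting previous contents), then delete the
    message. Returns the downloaded keys."""
    resp = sqs.receive_message(QueueUrl=queue_url,
                               MaxNumberOfMessages=1,
                               WaitTimeSeconds=20)
    downloaded = []
    for msg in resp.get("Messages", []):
        for key in json.loads(msg["Body"]).get("copied", []):
            s3.download_file(bucket, key, os.path.join(dest_dir, key))
            downloaded.append(key)
        sqs.delete_message(QueueUrl=queue_url,
                           ReceiptHandle=msg["ReceiptHandle"])
    return downloaded
```

The producer would run `publish_outputs(...)` followed by `time.sleep(60)` in a loop until the job completes; the consumer loops on `consume_once(...)`, relying on SQS long polling (`WaitTimeSeconds=20`) rather than a sleep.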