<p>The key thing about Twitter's stream API is that it's essentially a long-running request that's left open; data is pushed into it as and when it becomes available.</p> <p>The repercussion of that is that the server has to be able to deal with lots of concurrent open HTTP connections (one per client). A lot of existing servers don't manage that well. For example, Java servlet engines assign one thread per request, which can (a) get quite expensive and (b) quickly hit the normal max-threads setting, preventing subsequent connections.</p> <p>As you guessed, the Node.js model fits the idea of a streaming connection much better than, say, a servlet model does. Both requests and responses are exposed as streams in Node.js, but they don't occupy an entire thread or process, which means you can keep pushing data into a stream for as long as it remains open without tying up excessive resources. In theory you could have a lot of concurrent open responses connected to a single process and only write to each one when necessary.</p> <p>If you haven't looked at them already, the <a href="http://nodejs.org/docs/v0.4.7/api/http.html#http.ServerResponse" rel="noreferrer">HTTP docs for Node.js</a> might be useful.</p> <p>I'd also take a look at <a href="https://github.com/technoweenie/twitter-node/" rel="noreferrer">technoweenie's Twitter client</a> to see what the consumer end of that API looks like with Node.js, <a href="https://github.com/technoweenie/twitter-node/blob/master/lib/twitter-node/index.js#L104" rel="noreferrer">the stream() function in particular</a>.</p>
 
