
Apache Camel: GBs of data from database routed to JMS endpoint

I've done a few small projects in Camel now, but one thing I'm struggling to understand is how to deal with big data (that doesn't fit into memory) when consuming in Camel routes.

I have a database containing a couple of GBs worth of data that I would like to route using Camel. Obviously reading all the data into memory isn't an option.

If I were doing this as a standalone app I would have code that paged through the data and sent chunks to my JMS endpoint. I'd like to use Camel as it provides a nice pattern. If I were consuming from a file I could use the streaming() call.

Also, should I use camel-sql/camel-jdbc/camel-jpa, or use a bean to read from my database?

Hope everyone is still with me. I'm more familiar with the Java DSL but would appreciate any help/suggestions people can provide.

**Update: 2-MAY-2012**

So I've had some time to play around with this, and I think what I'm actually doing is abusing the concept of a Producer so that I can use it in a route.

```java
public class MyCustomRouteBuilder extends RouteBuilder {

    public void configure() {
        from("timer:foo?period=60s").to("mycustomcomponent:TEST");

        from("direct:msg").process(new Processor() {
            public void process(Exchange ex) throws Exception {
                System.out.println("Receiving value: " + ex.getIn().getBody());
            }
        });
    }
}
```

My producer looks something like the following. For clarity I've not included the CustomEndpoint or CustomComponent, as it just seems to be a thin wrapper.

```java
public class MyCustomProducer extends DefaultProducer {

    Endpoint e;
    CamelContext c;

    public MyCustomProducer(Endpoint epoint) {
        super(epoint);
        this.e = epoint;
        this.c = e.getCamelContext();
    }

    public void process(Exchange ex) throws Exception {
        Endpoint directEndpoint = c.getEndpoint("direct:msg");
        ProducerTemplate t = new DefaultProducerTemplate(c);

        // Simulate streaming operation / chunking of BIG data.
        for (int i = 0; i < 20; i++) {
            t.start();
            String s = "Value " + i;
            t.sendBody(directEndpoint, s);
            t.stop();
        }
    }
}
```

Firstly, the above doesn't seem very clean. It seems like the cleanest way to do this would be to populate a JMS queue (in place of direct:msg) via a scheduled Quartz job that my Camel route then consumes, so that I can have more flexibility over the message size received within my Camel pipelines. However, I quite liked the semantics of setting up time-based activations as part of the route.

Does anyone have any thoughts on the best way to do this?
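
To make the timer-plus-paging alternative concrete, here is a rough, untested sketch using only the core Java DSL. `DataPager`, the page size, the `bigData` queue name and the registered `jms` component are all placeholders; a real pager would issue a paged SELECT (LIMIT/OFFSET or a keyset predicate) rather than generating values.

```java
import java.util.ArrayList;
import java.util.List;

import org.apache.camel.builder.RouteBuilder;

public class PagedRouteBuilder extends RouteBuilder {

    // Placeholder paging bean: each call returns the next chunk of rows.
    // A real implementation would run a paged query against the database
    // and return an empty list once the table has been fully read.
    public static class DataPager {
        private static final int PAGE_SIZE = 1000;
        private int offset = 0;

        public List<String> nextPage() {
            List<String> page = new ArrayList<String>();
            for (int i = offset; i < offset + PAGE_SIZE; i++) {
                page.add("Value " + i); // stand-in for a database row
            }
            offset += PAGE_SIZE;
            return page;
        }
    }

    public void configure() {
        // Every 60 seconds fetch one page, then stream its rows to JMS one
        // message at a time, so only a single chunk is ever held in memory.
        from("timer:pager?period=60000")
            .bean(new DataPager(), "nextPage")
            .split(body()).streaming()
            .to("jms:queue:bigData");
    }
}
```

Presumably the camel-sql or camel-jpa consumers could replace the hand-rolled pager, but I haven't checked how they behave with result sets this large.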