Here's an example of processing the following CSV file into a bean:

```
headerA,headerB,headerC
col1,col2,col3
```

The first row (the header) is ignored, and the columns of each remaining row are mapped directly onto a 'matching' object (it's only done this way for brevity).
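The target bean itself isn't shown in the original configuration. A minimal sketch of what `de.incompleteco.spring.batch.domain.SimpleEntity` could look like, assuming plain `String` properties named after the tokenizer's columns (the real types aren't given), is:

```java
package de.incompleteco.spring.batch.domain;

/**
 * A guess at the target bean used by the BeanWrapperFieldSetMapper below.
 * The property names must match the tokenizer's column names (col1, col2, col3);
 * String is assumed here because the actual types aren't shown in the post.
 */
public class SimpleEntity {

    private String col1;
    private String col2;
    private String col3;

    public String getCol1() { return col1; }
    public void setCol1(String col1) { this.col1 = col1; }

    public String getCol2() { return col2; }
    public void setCol2(String col2) { this.col2 = col2; }

    public String getCol3() { return col3; }
    public void setCol3(String col3) { this.col3 = col3; }
}
```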
Here's the job configuration, using Spring Batch's out-of-the-box components:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:batch="http://www.springframework.org/schema/batch"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/batch http://www.springframework.org/schema/batch/spring-batch.xsd">

    <batch:job id="fileJob">
        <batch:step id="fileJob.step1">
            <batch:tasklet>
                <batch:chunk reader="fileReader" writer="databaseWriter" commit-interval="10000"/>
            </batch:tasklet>
        </batch:step>
        <batch:validator>
            <bean class="org.springframework.batch.core.job.DefaultJobParametersValidator">
                <property name="requiredKeys" value="fileName"/>
            </bean>
        </batch:validator>
    </batch:job>

    <bean id="fileReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
        <property name="lineMapper" ref="lineMapper"/>
        <property name="resource" value="file:#{jobParameters['fileName']}"/>
        <property name="linesToSkip" value="1"/>
    </bean>

    <bean id="lineMapper" class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
        <property name="fieldSetMapper" ref="fieldSetMapper"/>
        <property name="lineTokenizer" ref="lineTokenizer"/>
    </bean>

    <bean id="lineTokenizer" class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
        <property name="delimiter" value=","/>
        <property name="names" value="col1,col2,col3"/>
    </bean>

    <bean id="fieldSetMapper" class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
        <property name="targetType" value="de.incompleteco.spring.batch.domain.SimpleEntity"/>
    </bean>

    <bean id="databaseWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
        <property name="dataSource" ref="dataSource"/>
        <property name="itemSqlParameterSourceProvider">
            <bean class="org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider"/>
        </property>
        <property name="sql" value="insert into simple_entity (col1,col2,col3) values (:col1,:col2,:col3)"/>
    </bean>

</beans>
```

There are a couple of things to note:

1. The job needs a 'fileName' parameter to tell the fileReader where to find the file (see the launch sketch after the resource configuration below).
2. A jobParametersValidator is set to make sure that parameter is present.

Here's the batch resource configuration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:batch="http://www.springframework.org/schema/batch"
       xmlns:jdbc="http://www.springframework.org/schema/jdbc"
       xmlns:task="http://www.springframework.org/schema/task"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/batch http://www.springframework.org/schema/batch/spring-batch.xsd
                           http://www.springframework.org/schema/jdbc http://www.springframework.org/schema/jdbc/spring-jdbc.xsd
                           http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task.xsd">

    <batch:job-repository id="jobRepository"/>

    <bean id="jobExplorer" class="org.springframework.batch.core.explore.support.JobExplorerFactoryBean">
        <property name="dataSource" ref="dataSource"/>
    </bean>

    <bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
        <property name="jobRepository" ref="jobRepository"/>
        <property name="taskExecutor" ref="taskExecutor"/>
    </bean>

    <beans profile="junit">
        <jdbc:embedded-database id="dataSource" type="H2">
            <jdbc:script location="classpath:/org/springframework/batch/core/schema-h2.sql"/>
            <jdbc:script location="classpath:/META-INF/sql/schema-h2.sql"/>
        </jdbc:embedded-database>
        <task:executor id="taskExecutor"/>
        <bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
            <property name="dataSource" ref="dataSource"/>
        </bean>
    </beans>

</beans>
```
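To illustrate the two notes above, here's a minimal launch sketch (not part of the original post), assuming the fileJob and jobLauncher beans defined above are injected into a hypothetical FileJobRunner class. Leaving the fileName parameter out makes the launcher reject the run with a JobParametersInvalidException instead of starting the job.

```java
package de.incompleteco.spring.batch;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

/**
 * Hypothetical launcher; the job and jobLauncher fields are assumed to be
 * injected from the configurations above (e.g. via @Autowired or setters).
 */
public class FileJobRunner {

    private Job job;               // the fileJob defined above
    private JobLauncher jobLauncher;

    public JobExecution run(String fileName) throws Exception {
        // 'fileName' is the key required by the DefaultJobParametersValidator;
        // omitting it results in a JobParametersInvalidException.
        JobParameters parameters = new JobParametersBuilder()
                .addString("fileName", fileName)
                .toJobParameters();
        return jobLauncher.run(job, parameters);
    }
}
```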
Here's a unit test for it too:

```java
package de.incompleteco.spring.batch;

import static org.junit.Assert.assertEquals;

import java.io.File;
import java.io.FileOutputStream;

import javax.sql.DataSource;

import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration({"classpath:/META-INF/spring/*-context.xml"})
@ActiveProfiles("junit")
public class FileJobIntegrationTest {

    @Autowired
    private Job job;

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private JobExplorer jobExplorer;

    @Autowired
    private DataSource dataSource;

    private int recordCount = 1000000;

    private String fileName = System.getProperty("java.io.tmpdir") + File.separator + "test.csv";

    @Before
    public void before() throws Exception {
        if (new File(fileName).exists()) {
            new File(fileName).delete();
        }//end if
    }

    @Test
    public void test() throws Exception {
        //create a file with a header row and recordCount data rows
        FileOutputStream fos = new FileOutputStream(fileName);
        fos.write("col1,col2,col3\n".getBytes());
        for (int i = 0; i < recordCount; i++) {
            fos.write((i + "," + (i + 1) + "," + (i + 2) + "\n").getBytes());
        }//end for
        fos.flush();
        fos.close();
        //report the size of the file
        long length = new File(fileName).length();
        System.out.println("file size: " + ((length / 1024) / 1024) + "mb");
        //execute the job
        JobParameters jobParameters = new JobParametersBuilder().addString("fileName", fileName).toJobParameters();
        JobExecution execution = jobLauncher.run(job, jobParameters);
        //monitor - the launcher uses an async task executor, so poll until the job finishes
        while (jobExplorer.getJobExecution(execution.getId()).isRunning()) {
            Thread.sleep(1000);
        }//end while
        //load the execution again
        execution = jobExplorer.getJobExecution(execution.getId());
        //test
        assertEquals(ExitStatus.COMPLETED.getExitCode(), execution.getExitStatus().getExitCode());
        //let's see what's in the database
        int count = new JdbcTemplate(dataSource).queryForObject("select count(*) from simple_entity", Integer.class);
        //test
        assertEquals(recordCount, count);
    }

}
```
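The test relies on the simple_entity table created by /META-INF/sql/schema-h2.sql, which isn't shown in the post. A guess at an equivalent programmatic setup, with column types assumed since they aren't given, could look like this:

```java
package de.incompleteco.spring.batch;

import javax.sql.DataSource;

import org.springframework.jdbc.core.JdbcTemplate;

/**
 * Hypothetical stand-in for META-INF/sql/schema-h2.sql: creates the
 * simple_entity table targeted by the databaseWriter's insert statement.
 * Column types are assumed; adjust them to match the real SimpleEntity.
 */
public class SimpleEntitySchema {

    public static void create(DataSource dataSource) {
        new JdbcTemplate(dataSource).execute(
                "create table if not exists simple_entity ("
                + "col1 varchar(64), col2 varchar(64), col3 varchar(64))");
    }
}
```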
 
