
Multiple data sources with Spring Boot, Batch and Cloud Task

Here we will see how to configure separate data sources for the application and for Spring Batch. By default, Spring Batch stores job and execution details in a database. If a separate data source is not configured for Spring Batch, it uses whatever data source is available in the application and creates the batch tables there, which can put an unwanted burden on the application database. To avoid this, we will configure a separate data source for Spring Batch using an in-memory database, since we don't want to store the batch job details permanently.
The other point is the configuration of Spring Cloud Task when multiple data sources are present: it must point to the same data source that Spring Batch uses.
In the sections below, we will see how to configure the application, batch and cloud task data sources.

Application Data Source

Define the application data source in the properties or YAML configuration. Both url and jdbcUrl are set because the pooled DataSource built by DataSourceBuilder (typically HikariCP) binds jdbcUrl, while Spring Boot's auto-configuration reads url.
spring.datasource.url=jdbc:h2:tcp://localhost:19092/mem:app-data
spring.datasource.jdbcUrl=jdbc:h2:tcp://localhost:19092/mem:app-data
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=

spring.jpa.database-platform=org.hibernate.dialect.H2Dialect
Define the Spring beans for the data source, entity manager factory and transaction manager. We also configure the JPA repositories, pointing to the packages that contain our repository interfaces and domain objects used for business data.
@Configuration
@EnableJpaRepositories(
        entityManagerFactoryRef = "appEntityManagerFactory",
        transactionManagerRef = "appTransactionManager",
        basePackages = "com.ttj.app.repository"
)
@EnableTransactionManagement
public class AppDataSourceConfig {

    @Bean
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource appDataSource(){
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "appEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean appEntityManagerFactory(EntityManagerFactoryBuilder builder,
            @Qualifier("appDataSource") DataSource appDataSource){

        return builder
                .dataSource(appDataSource)
                .packages("com.ttj.app.domain")
                .persistenceUnit("app")
                .build();
    }

    @Bean(name = "appTransactionManager")
    public PlatformTransactionManager appTransactionManager(@Qualifier("appEntityManagerFactory") EntityManagerFactory
                                                                        appEntityManagerFactory) {

        return new JpaTransactionManager(appEntityManagerFactory);
    }
}
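
For reference, below is a minimal sketch of an entity and a repository living in the packages configured above. The Customer entity and CustomerRepository names are hypothetical and only illustrate where the business classes go; they use standard JPA annotations and Spring Data's JpaRepository.
// Hypothetical domain class under com.ttj.app.domain, handled by the "app" persistence unit
@Entity
public class Customer {

    @Id
    @GeneratedValue
    private Long id;

    private String name;

    // getters and setters omitted
}

// Hypothetical repository interface under com.ttj.app.repository, picked up by @EnableJpaRepositories above
public interface CustomerRepository extends JpaRepository<Customer, Long> {
}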

Spring Batch Data Source

Configure the data source properties below in the properties or YAML configuration. Note that if you are going to register this job as a cloud task with the Spring Cloud Data Flow server, make sure it points to the same database used by the Data Flow server.
spring.batch.datasource.url=jdbc:h2:tcp://localhost:19092/mem:dataflow
spring.batch.datasource.jdbcUrl=jdbc:h2:tcp://localhost:19092/mem:dataflow
spring.batch.datasource.driverClassName=org.h2.Driver
spring.batch.datasource.username=sa
spring.batch.datasource.password=
Define the Spring beans for the batch data source and transaction manager. Note that we mark the data source and transaction manager as primary beans: Spring Cloud Task looks for specific default bean names for its data source and transaction manager, and marking these beans as @Primary makes cloud task pick them up.
@Configuration
public class BatchDataSourceConfig {

    @Bean(name="batchDataSource")
    @Primary
    @ConfigurationProperties(prefix = "spring.batch.datasource")
    public DataSource batchDataSource(){
        return DataSourceBuilder.create().build();
    }

    @Primary
    @Bean(name = "batchEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean batchEntityManagerFactory(EntityManagerFactoryBuilder builder,
                                                                          @Qualifier("batchDataSource") DataSource batchDataSource){
        return builder
                .dataSource(batchDataSource)
                .packages("com.ttj.batch.domain")
                .persistenceUnit("batch")
                .build();
    }

    @Primary
    @Bean(name = "batchTransactionManager")
    public PlatformTransactionManager appTransactionManager(@Qualifier("batchEntityManagerFactory") EntityManagerFactory
                                                                        batchEntityManagerFactory) {

        return new JpaTransactionManager(batchEntityManagerFactory);
    }

    @Bean
    public BatchConfigurer configurer(@Qualifier("batchDataSource") DataSource batchDataSource) {

        return new DefaultBatchConfigurer(batchDataSource);
    }
}
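
With @EnableBatchProcessing and the BatchConfigurer above, the job repository tables are created in the batch data source. Below is a minimal sketch of a single-step job that would run against it; the job and step names are illustrative and it uses the Spring Batch 4 builder factories.
@Configuration
@EnableBatchProcessing
public class SampleJobConfig {

    // Illustrative one-step job; "sampleJob" and "sampleStep" are hypothetical names
    @Bean
    public Job sampleJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        Step step = steps.get("sampleStep")
                .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
                .build();

        return jobs.get("sampleJob")
                .start(step)
                .build();
    }
}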

Spring Cloud Task Data Source

We need to use the same data source that is configured for Spring Batch and define a TaskConfigurer bean as given below. By default Spring Cloud Task also looks for a transaction manager with the bean name "transactionManager"; to make it use the batch transaction manager we annotated that bean with @Primary, as shown in the previous section.
@Configuration
public class CloudTaskConfig {
    @Bean
    public TaskConfigurer taskConfigurer(@Qualifier("batchDataSource") DataSource batchDataSource){
        return new DefaultTaskConfigurer(batchDataSource);
    }

}
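Finally, the job is launched as a Spring Cloud Task from the Boot application class by adding @EnableTask. A minimal sketch, assuming spring-cloud-starter-task is on the classpath (the class name is illustrative):
@SpringBootApplication
@EnableTask
public class BatchTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}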
Other posts you may like to explore:
Spring batch job with spring cloud data flow server
How to setup spring cloud data flow server

Comments

  1. I followed you, but it's not creating the Spring Batch metadata tables automatically.
  2. Exactly what I was looking for, thank you.
  3. I'm facing a "Caused by: org.hsqldb.HsqlException: user lacks privilege or object not found: BATCH_JOB_INSTANCE" problem. What am I missing here?

     Reply: There could be several reasons, for example the batch user doesn't have permission in the configured database, or the application is connecting to the wrong database when multiple databases are configured.


Popular Posts

Setting up kerberos in Mac OS X

Kerberos in MAC OS X Kerberos authentication allows the computers in same domain network to authenticate certain services with prompting the user for credentials. MAC OS X comes with Heimdal Kerberos which is an alternate implementation of the kerberos and uses LDAP as identity management database. Here we are going to learn how to setup a kerberos on MAC OS X which we will configure latter in our application. Installing Kerberos In MAC we can use Homebrew for installing any software package. Homebrew makes it very easy to install the kerberos by just executing a simple command as given below. brew install krb5 Once installation is complete, we need to set the below export commands in user's profile which will make the kerberos utility commands and compiler available to execute from anywhere. Open user's bash profile: vi ~/.bash_profile Add below lines: export PATH=/usr/local/opt/krb5/bin:$PATH export PATH=/usr/local/opt/krb5/sbin:$PATH export LDFLAGS=...

Why HashMap key should be immutable in java

HashMap is used to store the data in key, value pair where key is unique and value can be store or retrieve using the key. Any class can be a candidate for the map key if it follows below rules. 1. Overrides hashcode() and equals() method.   Map stores the data using hashcode() and equals() method from key. To store a value against a given key, map first calls key's hashcode() and then uses it to calculate the index position in backed array by applying some hashing function. For each index position it has a bucket which is a LinkedList and changed to Node from java 8. Then it will iterate through all the element and will check the equality with key by calling it's equals() method if a match is found, it will update the value with the new value otherwise it will add the new entry with given key and value. In the same way it check for the existing key when get() is called. If it finds a match for given key in the bucket with given hashcode(), it will return the value other...

Entity to DTO conversion in Java using Jackson

It's very common to have the DTO class for a given entity in any application. When persisting data, we use entity objects and when we need to provide the data to end user/application we use DTO class. Due to this we may need to have similar properties on DTO class as we have in our Entity class and to share the data we populate DTO objects using entity objects. To do this we may need to call getter on entity and then setter on DTO for the same data which increases number of code line. Also if number of DTOs are high then we need to write lot of code to just get and set the values or vice-versa. To overcome this problem we are going to use Jackson API and will see how to do it with minimal code only. Maven dependency <dependency> <groupId>com.fasterxml.jackson.core</groupId> <artifactId>jackson-databind</artifactId> <version>2.9.9</version> </dependency> Entity class Below is ...