May 27, 2016

Pass arguments to your main method in gradle bootRun

If you are tired of Spring Boot configuration magic, here is one more trick to confuse you completely.
How do you configure Spring without knowing the profile or environment in advance? Dropwizard style, like this:
java -jar myjar.jar myconfig.yml.
No profiles, no wondering how your properties got populated.

Spring Boot's first-class configuration format is the Java bean. Unfortunately, due to legacy issues, we need to include an XML config as well. After some pain, suffering and reading, I found a solution:
@Configuration
@ImportResource("classpath:batch-config.xml") // path to the legacy XML config
public class BatchConfiguration {
}

In a similar way you can include a properties file.
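For example, a minimal sketch using Spring's @PropertySource annotation (the file name batch.properties is just an assumption; adjust it to your project layout):

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;

// Pulls key=value pairs from the classpath into the Spring Environment.
// "batch.properties" is a hypothetical file name.
@Configuration
@PropertySource("classpath:batch.properties")
public class BatchPropertiesConfiguration {
}
```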
The next step is ${properties}. The XML configuration contains some environment-specific properties, like the database url and so on. Some advise JNDI, but that introduces one more layer of magic: the configuration of a web container.
So the question is how to populate these properties from an external config. There are a ton of solutions out there, but none worked for me. Maybe because it's Spring Boot + Spring Data + Spring Batch, or maybe because I don't understand this page. Anyway, I found my own way:
public static void main(String[] args) throws IOException {
    if (args.length >= 1) {
        // Load the external config file and copy every entry into system properties,
        // where Spring's property placeholders will pick them up.
        Properties p = new Properties();
        p.load(new FileReader(args[0]));
        p.forEach((k, v) -> System.setProperty((String) k, (String) v));
    }
    SpringApplication.run(BatchConfiguration.class, args);
}
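For reference, the file passed as args[0] is read with java.util.Properties, so plain key=value lines work; these property names and values are only examples:

```
spring.datasource.url=jdbc:mysql://localhost:3306/batch
spring.datasource.username=batch
```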
Now it works perfectly with the fatJar task, but what about bootRun? bootRun saves a lot of dev time and is very easy to use. As you probably know, bootRun extends gradle's standard JavaExec task, with its awesome args parameter, which goes straight to your main method. So the gradle bootRun task looks like:
bootRun {
    args = ["myconfig.yml"]
}
That is all you need to make this solution work! Have fun.
P.S. HowTo pass jvmArgs
P.P.S. HowTo run job from Controller

May 25, 2016


Spring Batch is a nice choice for simple ETL jobs, but it doesn't work well with MongoDB, especially when writing to it. The MongoItemWriter provided by Spring Batch doesn't do bulk inserts.
Fortunately for us, bulk inserts are quite easy to implement:
import com.mongodb.BulkWriteOperation;
import com.mongodb.BulkWriteResult;
import com.mongodb.DBObject;
import org.springframework.batch.item.ItemWriter;
import org.springframework.data.mongodb.core.MongoTemplate;

import java.util.List;

public class MongoBulkItemWriter<T> implements ItemWriter<T> {

    private final String collection;
    private final MongoTemplate template;

    public MongoBulkItemWriter(String collection, MongoTemplate mongoTemplate) {
        this.collection = collection;
        this.template = mongoTemplate;
    }

    @Override
    public void write(List<? extends T> items) throws Exception {
        // Queue every item into a single unordered bulk operation and execute it once.
        BulkWriteOperation bulk = template.getCollection(collection).initializeUnorderedBulkOperation();
        items.forEach(i -> bulk.insert((DBObject) template.getConverter().convertToMongoType(i)));
        BulkWriteResult result = bulk.execute();
    }
}
It works much faster, but beware: inserts only, so an item with a duplicate id will ruin your batch. A solution might look something like this:
BulkWriteOperation bulk = template.getCollection(COLLECTION_NAME).initializeUnorderedBulkOperation();
updates.forEach(u ->
        bulk.find(new BasicDBObject("id", u.getId())).upsert().update(u.getDbObject()));
bulk.execute();
Upserts are much slower than pure inserts, but still a huge win compared with per-object writes.
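As a usage sketch, the writer can be plugged into a regular chunk-oriented step. Everything named here is an assumption: MyDocument, the bean and collection names, and the chunk size of 500.

```java
@Bean
public Step mongoImportStep(StepBuilderFactory steps,
                            ItemReader<MyDocument> reader,
                            MongoTemplate mongoTemplate) {
    // Each chunk of 500 items becomes one bulk write to "myCollection".
    return steps.get("mongoImportStep")
            .<MyDocument, MyDocument>chunk(500)
            .reader(reader)
            .writer(new MongoBulkItemWriter<>("myCollection", mongoTemplate))
            .build();
}
```

Matching the chunk size to the bulk batch is the main design choice here: the writer receives the whole chunk at once, so the chunk size directly controls how many documents go into each bulk operation.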