Introduction to Batch Processing by R4R Team

Batch processing: a naive approach to inserting 100,000 rows into the database using Hibernate.

The small code example given below helps us understand what batch processing needs to address:


Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();

for ( int i=0; i<100000; i++ ) {
    // each new Customer stays attached to the Session until commit
    Customer customer = new Customer(.....);
    session.save(customer);
}

tx.commit();
session.close();


This code would fall over with an OutOfMemoryError somewhere around the 50,000th row, because Hibernate caches all of the newly inserted Customer instances in the session-level cache. In this chapter we will show you how to avoid this problem.
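The usual fix, sketched below for the same naive loop, is to flush and clear the session at regular intervals matching the JDBC batch size; the Customer constructor arguments are left elided as in the example above:


Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();

for ( int i=0; i<100000; i++ ) {
    Customer customer = new Customer(.....);
    session.save(customer);
    if ( i % 20 == 0 ) { // 20, the same as the JDBC batch size
        // flush a batch of inserts and release the session-level cache memory
        session.flush();
        session.clear();
    }
}

tx.commit();
session.close();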

If you are undertaking batch processing, you will need to enable the use of JDBC batching; this is absolutely essential if you want to achieve optimal performance. Set the JDBC batch size to a reasonable number (10-50).


Example:


hibernate.jdbc.batch_size 20
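The property can live in hibernate.properties or hibernate.cfg.xml; as a minimal sketch, assuming a Configuration-based bootstrap, it can also be set programmatically when building the SessionFactory:


Configuration cfg = new Configuration().configure();
// enable JDBC insert/update batching with batches of 20 statements
cfg.setProperty("hibernate.jdbc.batch_size", "20");
SessionFactory sessionFactory = cfg.buildSessionFactory();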


Hibernate disables insert batching at the JDBC level transparently if you use an identity identifier generator.
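For example, with JPA annotation mappings (a sketch only; the Customer mapping here is assumed, and newer Hibernate versions use jakarta.persistence instead of javax.persistence), choosing a sequence-based generator keeps insert batching possible:


import javax.persistence.*;

@Entity
public class Customer {

    // IDENTITY would force each INSERT to run immediately so the generated
    // key can be read back, which disables JDBC insert batching.
    // A sequence-based generator lets Hibernate batch the inserts.
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    private Long id;

    // ... other fields and mappings
}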


You might also want to do this kind of work in a process where interaction with the second-level cache is completely disabled:


hibernate.cache.use_second_level_cache false


This is not absolutely necessary, since we can explicitly set the CacheMode to disable interaction with the second-level cache.
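As a minimal sketch, the same effect can be achieved per session through the CacheMode API:


Session session = sessionFactory.openSession();
// ignore the second-level cache for this session only
session.setCacheMode(CacheMode.IGNORE);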
