[hibernate-issues] [Hibernate-JIRA] Commented: (HHH-4042) StatelessSession does not flush when using jdbc batch_size > 1

Gael Beaudoin (JIRA) noreply at atlassian.com
Wed Jan 6 11:43:29 EST 2010


    [ http://opensource.atlassian.com/projects/hibernate/browse/HHH-4042?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=35117#action_35117 ] 

Gael Beaudoin commented on HHH-4042:
------------------------------------

Well, maybe it was not designed for this purpose, and I can understand that it's not optimized or something like that. But in this case, there is potential data loss (!).

Moreover, the documentation page you link is titled "Batch processing" ... are you sure it's not designed for this purpose?! And I'm pretty sure this is where I got the idea when dealing with lots of inserts: a stateless session is way faster and more memory efficient than using an entity manager.

Quoting the documentation page: "Alternatively, Hibernate provides a command-oriented API that can be used for streaming data to and from the database in the form of detached objects." I'm clearly not out of bounds here, or the documentation should be updated.

The documentation mentions that insert() is a direct database operation: this is what I use, a simple insert. But it fails because of a global setting (hibernate.jdbc.batch_size).

Either make it ignore the batch size setting, or fix the wrong behaviour, but leaving it like this does not seem, IMHO, a good decision.
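The data-loss mechanism described above can be sketched without Hibernate. The `Batcher` class below is a hypothetical stand-in for the internal JDBC batcher (not Hibernate's real API): inserts accumulate in a buffer and are only executed once the buffer reaches `batchSize`, so if nothing flushes the remainder when the session closes, the trailing rows silently never reach the database.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of a JDBC batcher: statements accumulate in a buffer
// and are only sent to the database when the buffer reaches batchSize.
// This is an illustrative sketch, not Hibernate's actual implementation.
class Batcher {
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();
    int rowsSent = 0; // stands in for rows actually written to the database

    Batcher(int batchSize) { this.batchSize = batchSize; }

    void addInsert(String row) {
        buffer.add(row);
        if (buffer.size() == batchSize) {
            executeBatch();
        }
    }

    private void executeBatch() {
        rowsSent += buffer.size();
        buffer.clear();
    }

    // Mimics the reported bug: closing discards the partial batch.
    void closeWithoutFlush() { buffer.clear(); }

    // The expected behaviour: flush the remainder before closing.
    void closeWithFlush() { executeBatch(); }
}

public class BatchLossDemo {
    public static void main(String[] args) {
        // 70 inserts with batch_size 50, as in the report.
        Batcher buggy = new Batcher(50);
        for (int i = 0; i < 70; i++) buggy.addInsert("row " + i);
        buggy.closeWithoutFlush();
        System.out.println(buggy.rowsSent); // only the first full batch of 50

        Batcher fixed = new Batcher(50);
        for (int i = 0; i < 70; i++) fixed.addInsert("row " + i);
        fixed.closeWithFlush();
        System.out.println(fixed.rowsSent); // all 70
    }
}
```

With the reporter's numbers (70 inserts, batch_size 50), the buggy close loses the last 20 rows, which is exactly the symptom described in the issue.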

> StatelessSession does not flush when using jdbc batch_size > 1
> --------------------------------------------------------------
>
>                 Key: HHH-4042
>                 URL: http://opensource.atlassian.com/projects/hibernate/browse/HHH-4042
>             Project: Hibernate Core
>          Issue Type: Bug
>          Components: core
>    Affects Versions: 3.3.1
>         Environment: JBoss 4.2.3, Linux, java 1.6, hibernate 3.3.1, entityManager 3.4.0, jboss seam 2.1.2, postgresql 8.3
>            Reporter: Gael Beaudoin
>
> I'm using a StatelessSession to insert millions of rows: it works great without using much memory. But I've just seen that with a jdbc batch size of 50, for example (<property name="hibernate.jdbc.batch_size" value="50"/> in my persistence.xml), the last round of inserts isn't flushed to the database. For example, with 70 inserts, only the first 50 are sent to the database.
> I've searched a lot about this issue, and on this thread (https://forum.hibernate.org/viewtopic.php?f=1&t=987882&start=0) the only solution found is to set the batch_size to 1, which is really a shame.
> I've tried to flush the session, close the jdbc connection, etc etc ... no luck.
> I'd be fine with a way to set the batch_size to 1 only for this method, programmatically, but I've not found any way to do that.
> If you don't pay attention, it's an easy way to lose data.

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators: http://opensource.atlassian.com/projects/hibernate/secure/Administrators.jspa
-
For more information on JIRA, see: http://www.atlassian.com/software/jira
