Spring Batch JpaItemWriter performance

Question

I use Spring Batch's JpaItemWriter to insert 11,500 records into a database table (the table has 8 columns), and I expected its performance to be good, but it was not as fast as I had hoped.
With a chunk size of 100, the 11,500 records took about 60 seconds. I then raised the chunk size to 10,000, but it still took about 50 seconds.
Could anyone suggest a better way to do bulk inserts or updates of a large amount of data with Spring Batch?
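For context, a chunk-oriented step like the one described in the question might be configured as follows. This is a minimal sketch using the Spring Batch 5 builder API; the step name, entity type `MyRecord`, and reader bean are hypothetical, not from the original post.

```java
// Hypothetical sketch of the setup described above: a chunk-oriented step
// that writes through JpaItemWriter. "insertStep", MyRecord, and
// myItemReader are assumed names for illustration only.
@Bean
public Step insertStep(JobRepository jobRepository,
                       PlatformTransactionManager txManager,
                       EntityManagerFactory entityManagerFactory,
                       ItemReader<MyRecord> myItemReader) {
    JpaItemWriter<MyRecord> writer = new JpaItemWriter<>();
    writer.setEntityManagerFactory(entityManagerFactory);
    return new StepBuilder("insertStep", jobRepository)
            // chunk size 100, as in the question; each chunk is one transaction
            .<MyRecord, MyRecord>chunk(100, txManager)
            .reader(myItemReader)
            .writer(writer)
            .build();
}
```

Raising the chunk size, as the question does, only changes how many items are buffered per transaction; each item is still handed to the JPA provider individually.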
Answer 1
Score: 0
The JpaItemWriter delegates item writing to a JPA EntityManager. So the performance of the writer is bound to the performance of the JPA implementation you use. What Spring Batch does is chunk the items and then call either entityManager.persist or entityManager.merge (depending on the usePersist parameter).

That said, raw JDBC inserts usually perform better than JPA, but this depends on the case. You need to try both approaches and compare the results.
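As a sketch of the raw-JDBC alternative the answer mentions, Spring Batch ships a JdbcBatchItemWriter that issues the chunk as a single batched JDBC statement. The table name, column names, and entity type below are assumptions for illustration:

```java
// Hypothetical sketch: swapping JpaItemWriter for JdbcBatchItemWriter,
// which sends each chunk as one batched INSERT over plain JDBC.
// Table/column names (my_table, col1, col2) and MyRecord are made up.
@Bean
public JdbcBatchItemWriter<MyRecord> jdbcWriter(DataSource dataSource) {
    return new JdbcBatchItemWriterBuilder<MyRecord>()
            .dataSource(dataSource)
            .sql("INSERT INTO my_table (col1, col2) VALUES (:col1, :col2)")
            .beanMapped() // bind :col1/:col2 named parameters from MyRecord getters
            .build();
}
```

If you stay with JPA and use Hibernate, also check whether JDBC batching is enabled on the provider side (for example the hibernate.jdbc.batch_size property); without it, each persist typically becomes its own round trip to the database.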