CodexBloom - Programming Q&A Platform

How can I optimize the performance of a Java application using Hibernate with large datasets?

πŸ‘€ Views: 98 πŸ’¬ Answers: 1 πŸ“… Created: 2025-06-15
hibernate postgresql performance java

I am currently working on a Java application using Hibernate (version 5.4) for ORM with a PostgreSQL database, and I'm facing serious performance issues when dealing with large datasets (over 1 million records). My queries are taking too long, and the application becomes unresponsive during data fetch operations.

I've tried using pagination with `setFirstResult()` and `setMaxResults()`, but it only improves the situation marginally. Here's the code snippet I am using for fetching data:

```java
List<MyEntity> results = session.createQuery("FROM MyEntity", MyEntity.class)
        .setFirstResult(page * pageSize)
        .setMaxResults(pageSize)
        .getResultList();
```

Additionally, I've enabled Hibernate's second-level cache and query cache (my cache setup is sketched at the end of this post), but I don't see any noticeable change in performance. The application still takes about 20 seconds to load a page with 100 records.

I'm considering using projections to fetch only the needed fields, but I'm not sure how to implement that properly in this context. Here's another approach I tried using DTOs (the `MyDTO` class itself is included at the end of this post):

```java
List<MyDTO> results = session.createQuery(
                "SELECT new com.example.MyDTO(e.field1, e.field2) FROM MyEntity e", MyDTO.class)
        .setFirstResult(page * pageSize)
        .setMaxResults(pageSize)
        .getResultList();
```

However, I still notice that query execution times are far from optimal. Is there a better strategy to fetch large datasets efficiently, or any specific configurations I should consider in Hibernate or PostgreSQL to optimize performance? Any help would be greatly appreciated!
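For reference, here is roughly how the second-level cache is wired up on my side. This is a minimal sketch using the standard Hibernate 5.x annotations and properties; the Ehcache region factory, the `READ_WRITE` strategy, and the `field1`/`field2` columns are stand-ins for my actual configuration:

```java
import javax.persistence.Cacheable;
import javax.persistence.Entity;
import javax.persistence.Id;

import org.hibernate.annotations.Cache;
import org.hibernate.annotations.CacheConcurrencyStrategy;

// Entity opted into the second-level cache. hibernate.cfg.xml additionally sets
// hibernate.cache.use_second_level_cache=true, hibernate.cache.use_query_cache=true,
// and hibernate.cache.region.factory_class=org.hibernate.cache.ehcache.EhCacheRegionFactory
// (the Ehcache region factory and READ_WRITE strategy are assumptions about my setup).
@Entity
@Cacheable
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE)
public class MyEntity {

    @Id
    private Long id;

    private String field1;
    private String field2;

    // getters and setters omitted for brevity
}
```

On the query side, the paged HQL from above additionally calls `.setCacheable(true)` so it can participate in the query cache, yet page load times stay around 20 seconds.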
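And for completeness, the `com.example.MyDTO` referenced in the constructor expression above is just a simple value holder along these lines (the `String` field types here are illustrative; the real ones match the entity columns):

```java
package com.example;

// Projection DTO matched by "SELECT new com.example.MyDTO(e.field1, e.field2)";
// the constructor parameter order must match the select list of the query.
public class MyDTO {

    private final String field1;
    private final String field2;

    public MyDTO(String field1, String field2) {
        this.field1 = field1;
        this.field2 = field2;
    }

    public String getField1() {
        return field1;
    }

    public String getField2() {
        return field2;
    }
}
```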