jksung

asked on

SpringBoot repository find response time increases with many concurrent calls, while manual database search not affected

I am trying to improve the response time of my REST APIs (I am using SpringBoot 2.1.3 with Groovy 2.4.14 and MSSQL). I noticed that the more popular GET APIs are, at certain time periods, taking much longer than they should (>4 seconds as opposed to 0.3 seconds). I've looked into CPU usage, memory, blocking threads, blocking DBs, DeferredResult, fetching schemes, SpringBoot and JPA settings, etc.; none of these were a bottleneck, or they were simply not relevant (the database search is a simple repository.findById() for a domain object with a few primitive fields).
List<Object> getObjectForId(String id) {
    curCallCount++                                            // track calls currently inside this method
    List<Object> objList = objectRepository.findAllById(id)   // the repository call that slows down
    curCallCount--
    objList                                                   // Groovy implicit return
}


The issue seems to be that the more calls to the service that have not yet exited at the time of a new call, the longer that call's response time. The correlation is almost linear: with 50 existing calls to the service, repository.findById() takes 5 seconds; with 200, it takes 20 seconds. Meanwhile, with 200 concurrent calls in flight, the manual database query is still fast (0.3 seconds).
Is this expected for Spring repository calls? Where is the overhead in repository.findById() coming from in an environment where there are many concurrent calls to the service, even though manual database search performance is not affected?
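For illustration, a rough way to compare the two paths side by side inside the application would be something like the sketch below; the component, repository type name, and table name are placeholders, not my actual code:

    import org.springframework.jdbc.core.JdbcTemplate
    import org.springframework.stereotype.Component

    // Illustrative timing probe: times the JPA repository path against a plain
    // JdbcTemplate query so the two can be compared under the same load.
    // ObjectRepository and object_table are placeholder names.
    @Component
    class FindTimingProbe {

        private final ObjectRepository objectRepository
        private final JdbcTemplate jdbcTemplate

        FindTimingProbe(ObjectRepository objectRepository, JdbcTemplate jdbcTemplate) {
            this.objectRepository = objectRepository
            this.jdbcTemplate = jdbcTemplate
        }

        void compare(String id) {
            long t0 = System.nanoTime()
            objectRepository.findAllById(id)   // same repository call as in the service method above
            long t1 = System.nanoTime()
            jdbcTemplate.queryForList('SELECT * FROM object_table WHERE id = ?', id)   // plain JDBC query
            long t2 = System.nanoTime()
            println "repository: ${(t1 - t0) / 1000000} ms, plain JDBC: ${(t2 - t1) / 1000000} ms"
        }
    }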
David Johnson, CD

I would look at the SQL Server end as that seems to be the bottleneck.
jksung

ASKER

Thanks. After enabling Hibernate statistics (a configuration sketch follows the log excerpts below), it seems that the more concurrent calls are inside the service function, the longer it takes to acquire the JDBC connection:

1 concurrent call to the service function:
17:32:29.436 [QUIET] [system.out] 2020-10-08 17:32:29.436  INFO --- [nio-8086-exec-9] i.StatisticalLoggingSessionEventListener : - Session Metrics {
17:32:29.436 [QUIET] [system.out]     94033720 nanoseconds spent acquiring 1 JDBC connections;
17:32:29.436 [QUIET] [system.out]     0 nanoseconds spent releasing 0 JDBC connections;
17:32:29.436 [QUIET] [system.out]     94396274 nanoseconds spent preparing 1 JDBC statements;
17:32:29.437 [QUIET] [system.out]     325896065 nanoseconds spent executing 1 JDBC statements;

80 concurrent calls to the service function:
17:41:29.934 [QUIET] [system.out] 2020-10-08 17:41:29.935  INFO --- [io-8086-exec-58] i.StatisticalLoggingSessionEventListener : - Session Metrics {
17:41:29.935 [QUIET] [system.out]     7180991368 nanoseconds spent acquiring 1 JDBC connections;
17:41:29.935 [QUIET] [system.out]     0 nanoseconds spent releasing 0 JDBC connections;
17:41:29.935 [QUIET] [system.out]     7181027023 nanoseconds spent preparing 1 JDBC statements;
17:41:29.935 [QUIET] [system.out]     1759766484 nanoseconds spent executing 1 JDBC statements;

Meanwhile the manual query is still fast and takes no time to connect to the database.
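For reference, a minimal way to turn on these session metrics in application.properties (assuming the standard Spring Boot passthrough of Hibernate properties; the exact setup may differ):

    # enable Hibernate statistics so each session logs the metrics shown above
    spring.jpa.properties.hibernate.generate_statistics=true
    # the per-session metrics are written by this Hibernate listener at INFO level
    logging.level.org.hibernate.engine.internal.StatisticalLoggingSessionEventListener=INFO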

Is there a reason why the time to acquire the JDBC connection is so linearly sensitive to the number of concurrent calls to the service making the repository.findById() call, when the manual query is not affected? I've tried specifying a large value for server.tomcat.max-threads and server.tomcat.max-connections (sketched below), but the result is the same: the more concurrent calls to repository.findById(), the longer it takes to get the JDBC connection, even though the manual query call is fast and spends no time getting a database connection.
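The settings tried looked roughly like this in application.properties (the values below are placeholders, not the exact ones used):

    # illustrative values only - these govern Tomcat's request worker threads and sockets
    server.tomcat.max-threads=400
    server.tomcat.max-connections=10000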
jksung

ASKER

The issue was the Hikari connection pool size being too small (10 by default), so that when there are more than 10 concurrent calls, requests must wait for a free connection. Increasing the maximum pool size (to 150, for example) resolved the issue.
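A minimal sketch of the change in application.properties (assuming the standard Spring Boot HikariCP property; 150 matches the example above):

    # raise HikariCP's maximum pool size from the default of 10
    spring.datasource.hikari.maximum-pool-size=150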