OCI Load Balancer Not Distributing Traffic Evenly Across Instances in Java Application
I've been struggling with this for a few days now and could really use some help. I have an OCI Load Balancer that doesn't seem to distribute incoming traffic evenly across its backend instances. The setup is a load balancer with two backend server instances, each running an identical Java Spring Boot application, and no health check issues are reported. However, when I generate traffic using a load testing tool, one instance handles the majority of the requests while the other remains under-utilized. I've configured the load balancer with a round-robin policy, but it still appears to favor one backend over the other.

Here's a snippet of my load balancer configuration:

```yaml
loadBalancer:
  name: "my-load-balancer"
  backendSets:
    my-backend-set:
      backends:
        - ipAddress: "10.0.0.1"
          port: 8080
        - ipAddress: "10.0.0.2"
          port: 8080
      policy: "ROUND_ROBIN"
      healthChecker:
        protocol: "HTTP"
        urlPath: "/health"
        port: 8080
        returnCode: "200"
```

To troubleshoot, I verified that both instances are up, healthy, and responding correctly to health checks. I also checked the application logs on both instances for discrepancies but didn't find anything unusual.

In my load testing, I'm simulating 1000 concurrent users hitting the load balancer endpoint. The load testing tool shows one instance responding to about 800 requests, while the other responds to only about 200.

Is there something I might be missing in the load balancer configuration, or is there a known limitation or behavior that could explain why the traffic isn't being distributed as expected? This is part of a larger microservice API I'm building, so any insights or advice would be greatly appreciated!
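For reference, this is the 50/50 split I'd expect from the ROUND_ROBIN policy. Below is a toy local simulation I put together (the ports 18080/18081, the class name, and the backend names are all made up, and the two stub servers stand in for my real instances): a client alternates strictly between two backends per request and tallies who answered.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Map;
import java.util.TreeMap;
import com.sun.net.httpserver.HttpServer;

public class RoundRobinCheck {

    // Trivial stub backend that answers every request with its own name.
    static HttpServer backend(int port, String name) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/", exchange -> {
            byte[] body = name.getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    // Send `requests` requests, alternating between the two backends per
    // request (strict round-robin), and tally which backend answered each one.
    static Map<String, Integer> run(int requests) throws Exception {
        int[] ports = {18080, 18081}; // made-up local ports
        HttpServer a = backend(ports[0], "backend-a");
        HttpServer b = backend(ports[1], "backend-b");
        HttpClient client = HttpClient.newHttpClient();
        Map<String, Integer> counts = new TreeMap<>();
        for (int i = 0; i < requests; i++) {
            URI uri = URI.create("http://localhost:" + ports[i % 2] + "/");
            String who = client.send(HttpRequest.newBuilder(uri).build(),
                    HttpResponse.BodyHandlers.ofString()).body();
            counts.merge(who, 1, Integer::sum);
        }
        a.stop(0);
        b.stop(0);
        return counts;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(run(1000)); // prints {backend-a=500, backend-b=500}
    }
}
```

With per-request rotation the counts come out exactly even, which is why the 800/200 split I'm seeing surprises me.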