Inconsistent Results from Greedy Coin Change Algorithm in Java - Handling Edge Cases
I'm working on a personal project and I'm stuck on something that should probably be simple. I'm implementing a greedy algorithm for the coin change problem in Java, but I'm getting inconsistent results when testing with certain input values. The algorithm is supposed to return the minimum number of coins needed to make a given amount using the available denominations. Here's the code snippet I have so far:

```java
import java.util.Arrays;

public class CoinChange {
    public static int minCoins(int[] coins, int amount) {
        Arrays.sort(coins);
        int coinCount = 0;
        // Greedily take the largest denomination that still fits.
        for (int i = coins.length - 1; i >= 0; i--) {
            while (amount >= coins[i]) {
                amount -= coins[i];
                coinCount++;
            }
        }
        // If nothing is left over, we found a combination; otherwise report failure.
        return amount == 0 ? coinCount : -1;
    }
}
```

When I test this with `coins = {1, 3, 4}` and `amount = 6`, I expect to get `2` (using two coins of `3`), but it returns `3` (one `4` and two `1`s). I suspect this is because the greedy choice doesn't always yield an optimal solution. I also tried adding a check to break out of the loop early when the remaining amount is zero, but that didn't resolve the problem. I've read that the greedy algorithm can give suboptimal results in certain cases, particularly when the denomination set is one where always taking the largest coin isn't optimal. Should I consider a dynamic programming approach instead? Any insights on how to handle this case, or how to refactor my code to improve it, would be greatly appreciated!
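
For reference, this is the kind of bottom-up DP alternative I've been sketching out (the class name `CoinChangeDP`, the sentinel value, and the `main` test are just my own placeholders, so please correct me if the recurrence is off):

```java
import java.util.Arrays;

public class CoinChangeDP {
    // dp[a] = minimum number of coins needed to make amount a.
    public static int minCoins(int[] coins, int amount) {
        int[] dp = new int[amount + 1];
        Arrays.fill(dp, amount + 1); // sentinel: larger than any feasible answer
        dp[0] = 0;                   // zero coins are needed to make amount 0
        for (int a = 1; a <= amount; a++) {
            for (int coin : coins) {
                if (coin <= a) {
                    // Either keep the current best, or use this coin on top of dp[a - coin].
                    dp[a] = Math.min(dp[a], dp[a - coin] + 1);
                }
            }
        }
        // If dp[amount] was never updated, the amount can't be made from these coins.
        return dp[amount] > amount ? -1 : dp[amount];
    }

    public static void main(String[] args) {
        // My failing case: should print 2 (two coins of 3), not 3.
        System.out.println(minCoins(new int[]{1, 3, 4}, 6));
    }
}
```

Is this the right direction, or is there a way to keep the greedy version and still handle denomination sets like `{1, 3, 4}` correctly?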