PowerShell 7.3 - Difficulty Aggregating Multiple Log Files with Dynamic File Names
I've been banging my head against this for hours. I'm trying to aggregate multiple log files from a directory, but the filenames are dynamic and change daily based on the date. I'm using PowerShell 7.3 and have tried `Get-ChildItem` with a wildcard to match the log files, but I keep running into issues when the filenames don't match the expected patterns. Here's my code:

```powershell
$logFiles = Get-ChildItem -Path 'C:\Logs' -Filter '*.log' |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) }

$aggregatedContent = foreach ($file in $logFiles) {
    Get-Content $file.FullName
}

$aggregatedContent | Set-Content 'C:\Logs\AggregatedLog.txt'
```

This works fine when log files are present, but when there are none I get an error:

```
Get-Content : Cannot find path 'C:\Logs\<filename>' because it does not exist.
```

I want the script to handle this case gracefully without breaking. I've also noticed that when there are many files, the script becomes slow because of the amount of data being read. Is there a more efficient way to do this, perhaps by reading the files in parallel or by filtering the logs on a specific string pattern? I've considered `ForEach-Object -Parallel`, but I'm not sure how to implement it correctly with dynamic file names.

Any help making this more robust and efficient would be greatly appreciated.
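For the empty-directory case, this is the guard I've sketched so far (the warning text and the streaming pipeline at the end are just my guesses at the idiomatic approach):

```powershell
# Collect only real files modified in the last 24 hours
$logFiles = Get-ChildItem -Path 'C:\Logs' -Filter '*.log' -File |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) }

# Bail out early instead of letting Get-Content fail on nothing
if (-not $logFiles) {
    Write-Warning 'No log files from the last 24 hours were found; skipping aggregation.'
    return
}

# Stream file contents straight into the output file
# rather than buffering everything in a variable first
$logFiles | Get-Content | Set-Content 'C:\Logs\AggregatedLog.txt'
```

Is checking `-not $logFiles` the right test here, or should I be comparing `.Count`?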
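And this is roughly what I was imagining for the parallel version. The `'ERROR'` pattern and the throttle limit of 4 are placeholder values, not what I'd necessarily ship; I'm mainly unsure whether this is the correct shape for `ForEach-Object -Parallel`:

```powershell
# PowerShell 7+ only: read and filter the matched files in parallel
$logFiles = Get-ChildItem -Path 'C:\Logs' -Filter '*.log' -File |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) }

$aggregated = $logFiles | ForEach-Object -Parallel {
    # Select-String filters while reading, so only matching
    # lines (here, a placeholder 'ERROR' pattern) are kept
    Select-String -Path $_.FullName -Pattern 'ERROR' |
        ForEach-Object Line
} -ThrottleLimit 4

$aggregated | Set-Content 'C:\Logs\AggregatedLog.txt'
```

One thing I'm aware of: with `-Parallel`, the order of lines in the output is presumably not guaranteed to match the file order. Is that acceptable to work around, or is there a better pattern?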