PowerShell 7.3 - Combining Get-ChildItem and Export-Csv for a Large Dataset
I've been struggling with this for a few days now and could really use some help. I'm relatively new to this, so bear with me.

I'm trying to export a list of files from a directory and its subdirectories to a CSV file using PowerShell 7.3. However, when the number of files is large (over 10,000), the script runs into memory issues and eventually stops without completing. My current approach uses `Get-ChildItem` to retrieve the files and `Export-Csv` to write them to a CSV file. Here's the code I'm using:

```powershell
$directoryPath = 'C:\MyDirectory'
$outputPath = 'C:\MyOutput\files.csv'

Get-ChildItem -Path $directoryPath -Recurse -File |
    Select-Object FullName, LastWriteTime, Length |
    Export-Csv -Path $outputPath -NoTypeInformation -Encoding UTF8
```

When I run this, it starts fine, but eventually I get an error:

```
OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
```

I tried increasing the available memory on the machine, but that didn't help. I also attempted to reduce the amount of data by filtering out certain file types with `Where-Object`, but I ended up with the same result. Here's what I tried:

```powershell
Get-ChildItem -Path $directoryPath -Recurse -File |
    Where-Object { $_.Extension -ne '.tmp' } |
    Select-Object FullName, LastWriteTime, Length |
    Export-Csv -Path $outputPath -NoTypeInformation -Encoding UTF8
```

Is there a more memory-efficient way to handle this export, or a different way of writing to CSV that can manage larger datasets? This is part of a larger API I'm building, so the export needs to be reliable. Am I missing something obvious? Any guidance would be greatly appreciated.
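For what it's worth, one alternative I've been considering is skipping the PowerShell pipeline entirely and streaming each row to disk with .NET, so no file objects accumulate in memory. This is just a rough, untested sketch (the paths are the same placeholders as above, and the quoting is naive — it doesn't escape quotes inside file names):

```powershell
$directoryPath = 'C:\MyDirectory'
$outputPath = 'C:\MyOutput\files.csv'   # output directory must already exist

# Open a UTF-8 writer and stream one CSV line per file as it is enumerated.
$writer = [System.IO.StreamWriter]::new($outputPath, $false, [System.Text.Encoding]::UTF8)
try {
    $writer.WriteLine('"FullName","LastWriteTime","Length"')

    # EnumerateFiles yields paths lazily instead of building an array up front.
    foreach ($path in [System.IO.Directory]::EnumerateFiles(
            $directoryPath, '*', [System.IO.SearchOption]::AllDirectories)) {
        $info = [System.IO.FileInfo]::new($path)
        $writer.WriteLine(('"{0}","{1}","{2}"' -f
            $info.FullName, $info.LastWriteTime, $info.Length))
    }
}
finally {
    $writer.Dispose()
}
```

Would something along these lines actually be more memory-efficient, or should the original `Get-ChildItem | Export-Csv` pipeline already be streaming and the problem lies elsewhere?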