Bash script using `find` with `-exec` fails on file names containing newlines
After trying multiple solutions online, I still can't figure this out. I'm having trouble with a Bash script that uses the `find` command in combination with `-exec`. The intended behavior is to find all files in a directory and run a specific command on each of them, but when file names contain newlines the command fails unexpectedly. Here's the relevant portion of my script:

```bash
#!/bin/bash
OUTPUT_DIR="./output"
mkdir -p "$OUTPUT_DIR"
find ./input -type f -exec sh -c 'echo "Processing: $1"; cp "$1" "$OUTPUT_DIR"/' sh {} \;
```

This script is supposed to copy all files from `./input` to `./output`, echoing the name of each file as it is processed. However, when a file whose name contains a newline exists (something like `file\nname.txt`), the command breaks, resulting in an error:

`cp: cannot stat 'file': No such file or directory`

I've tried using `-print0` with `xargs -0`, but that doesn't seem to solve the problem either. Here's what I attempted:

```bash
find ./input -type f -print0 | xargs -0 -I {} sh -c 'echo "Processing: {}"; cp "{}" "$OUTPUT_DIR"'
```

This also fails for a file with a newline in its name, and the output just hangs without processing any files correctly. I've verified with `ls` that files with newlines do exist and that `find` locates them.

Is there a robust way to handle file names with newlines in Bash when using `find` and `-exec`? Any insights or alternative solutions would be greatly appreciated! I'm using a recent version of Bash.
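For reference, here's the minimal setup I've been testing with (the file name `file\nname.txt` is just my test case). It also shows what I mean about the newline: plain `find` output splits the one name across two lines, while `-print0` keeps it as a single NUL-terminated record.

```shell
mkdir -p ./input
# Bash ANSI-C quoting ($'...') embeds a literal newline in the file name
touch ./input/$'file\nname.txt'

# Plain find prints the single name across two lines...
find ./input -type f | wc -l                          # 2 lines for 1 file

# ...while -print0 emits one NUL-terminated record (count the NUL bytes)
find ./input -type f -print0 | tr -dc '\0' | wc -c    # 1 record
```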