I have multiple files on a Linux system that I want to copy into a different directory with a single cp command. Should I write a bash script to copy them one by one?
https://www.zylk.net/en/web-2-0/blog/-/blogs/how-to-copy-files-in-linux-faster-and-safer-than-cp could help...
Use rsync?
I've never needed a bash script to do these.
Use `cp -r` to copy files and folders recursively:
cp -r my-folder/* /to/some/other/folder/
Hint - if you run `man cp` it will bring up the manual page for that command. In fact, you can put `man` before almost any command to bring up its manual. Example: `man ls`.
If you want to copy a folder from one system to another, I usually use `scp -r`, or I first tar.gz the contents on the source, copy that one file to the other system, and then untar at the destination. But you can also use FTP with curl, or rsync. What I tend to do is figure out how to do something, then make a note in my personal wiki pages. Next time I need to do it, I check my own documentation. I've got to the point now where I have personal documentation for most things I need to do, and if I don't, I search for it, test it, and then add it to the docs :twitch: :mrgreen:
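The tar route looks roughly like this (a sketch with temporary paths; the remote host, user, and target path in the commented-out lines are hypothetical):

```shell
# Bundle a directory's contents into one archive -
# a single file transfers faster than many small ones
src=$(mktemp -d)
touch "$src/one.conf" "$src/two.conf"
archive="$(mktemp -d)/bundle.tar.gz"
tar -czf "$archive" -C "$src" .

# Then copy the single archive across (hypothetical host/path):
#   scp "$archive" user@remote:/tmp/
# and on the destination:
#   tar -xzf /tmp/bundle.tar.gz -C /some/target

# Local demonstration of the untar step:
dest=$(mktemp -d)
tar -xzf "$archive" -C "$dest"
ls "$dest"
```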
No, you shouldn't need a bash script. cp already accepts multiple source files: list every file you want to copy and make the last argument the destination directory.
cp file1 file2 file3 /mnt/backup
Reference: https://www.networking-forums.com/programming-goodies-and-software-defined-networking/copy-bulk-files-in-linux
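Here's that multi-source form run end to end in a temporary directory (file names are illustrative):

```shell
# cp takes any number of sources; the final argument is the destination dir
workdir=$(mktemp -d)
touch "$workdir/file1" "$workdir/file2" "$workdir/file3"
mkdir "$workdir/backup"

cp "$workdir/file1" "$workdir/file2" "$workdir/file3" "$workdir/backup/"

ls "$workdir/backup"
```

With bash you can shorten the source list with globs (`cp *.log backup/`) or brace expansion (`cp file{1..3} backup/`).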