I'm trying to do a full backup and copy all files from one directory into another directory
#!/bin/bash
#getting files from this directory
PROJECTDIRECTORY=/../../Project3
#Copied to this directory
FILEBACKUPLOCATION= /../.../FMonday
for FILENAME in $PROJECTDIRECTORY/*
do
cp $FILENAME $FILEBACKUPLOCATION
done
But I keep getting these errors:
./fullbackup: line 6: /../../FMonday: Is a directory
cp: missing destination file operand after
Variable assignments in bash require no space between the variable name and the value, or (unless quoted) within the value. Because of the space in the line FILEBACKUPLOCATION= /../.../FMonday, bash assigned FILEBACKUPLOCATION the empty string and then tried to execute /../.../FMonday as a command, which caused the first error. The variable was still empty when cp used it further down, which accounts for the second error.
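A minimal sketch of the three cases, using a hypothetical path:
DEST=/backup/FMonday    # correct: no spaces around '='
DEST= /backup/FMonday   # wrong: assigns the empty string, then tries to execute /backup/FMonday as a command
DEST = /backup/FMonday  # wrong: runs a command named 'DEST' with '=' and the path as arguments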
I think what you need is FILEBACKUPLOCATION=/FMonday/
In my case, I just used a space and the dot operator after the file name, and it worked perfectly.
For example, darthVader is a file inside the F drive, so I used the command "mv /f/darthVader ."
As you already found, the variable needs to be FILEBACKUPLOCATION=/location with no space. Also, if possible, try to make your script a little more portable by using something like:
FILEBACKUPLOCATION=$HOME/backup/FMonday
In this case, the location is relative to your user $HOME environment variable.
Going further, if you want to keep two directories in sync, you could probably do better with rsync; one of its advantages is that it only copies files that don't exist at the destination and updates the ones that have changed.
You could indeed create an alias to copy/sync files on demand, for example:
alias cpr="rsync --delete --archive --numeric-ids --human-readable --no-compress --whole-file --verbose --info=progress2"
Then, if you would like to sync the contents of /dir/foo into /dir/bar, you could simply do:
$ cpr /dir/foo/ /dir/bar/
Your script also could then be something like:
#!/bin/sh
ORIGIN=/path/to/foo/
DESTINATION=/path/to/bar/
rsync --delete \
--archive \
--numeric-ids \
--human-readable \
--no-compress \
--whole-file \
--verbose \
--info=progress2 \
"$ORIGIN" "$DESTINATION"
Notice that in this case the options --no-compress and --whole-file are used, mainly because the files are copied locally. If the destination were a remote server, the options could be different; check this post: The fastest remote directory rsync over ssh archival I can muster (40MB/s over 1gb NICs).
I'm trying to make a bash expect script that takes input like user, host, password, and file names and then copies the files from remote to local. From what I've read so far, scp'ing multiple files from remote to local works just fine when you assume the files are coming from ~/, i.e.:
scp -r user@host:"file1 file2 file3" .
would copy file1, file2, and file3 from ~/ into the current working directory. But I need to be able to pass in another directory as an argument, so my bash script looks like this (but doesn't work, I'll explain how):
eval spawn scp -oStrictHostKeyChecking=no -oCheckHostIP=no -r $user@$host:$dir/"$file1 $file2 $file3" ~/Downloads
This doesn't work: the shell raises a "No such file or directory" error for everything after the first file, which I would assume means that $dir is applied only to the first file, after which the lookup falls back to ~/, where of course the files can't be found. I've looked everywhere for an answer on this but can't find one, and doing it one file at a time would be super tedious.
Assuming your remote login shell understands Brace Expansion, this should work:
scp $user@$host:$dir/"{$file1,$file2,$file3}" ~/Downloads
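The quotes are what make this work: double quotes let the variables expand locally but suppress local brace expansion, so the braces reach the remote shell intact. For example, with the hypothetical values dir=/data, file1=a.txt, file2=b.txt and file3=c.txt, the command your local shell actually runs is:
scp user@host:"/data/{a.txt,b.txt,c.txt}" ~/Downloads
and the remote shell then expands the braces into the three full paths.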
If you want to download multiple files matching a specific pattern, you can do the following. For example, to get all zip files (the quotes keep your local shell from expanding the pattern, so the remote shell does the matching):
scp -r user@host:/path/to/files/"*.zip" /your/local/path
Suppose I have a folder named my_folder_old in /path/to/folder. How can I create a duplicate named my_folder_new in the same directory?
EDIT
Moreover, if my_folder_new already exists, my_folder_old is created inside it rather than replacing it. Why is this happening?
cp -frp /path/to/folder/my_folder_old -T /path/to/folder/my_folder_new
-f, --force
if an existing destination file cannot be opened, remove it
and try again (this option is ignored when the -n option is
also used)
-p same as --preserve=mode,ownership,timestamps
-R, -r, --recursive
copy directories recursively
-T, --no-target-directory
treat DEST as a normal file
Though if my_folder_new already exists, my_folder_old is created inside it and not substituted. Why is this happening?
This happens because my_folder_new has already been created. When you run the same cp command again, cp sees /path/to/folder/my_folder_new/ as an existing destination directory and copies the source into it.
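A minimal sketch of the difference, assuming both directories already exist:
cp -r  my_folder_old my_folder_new    # destination exists: creates my_folder_new/my_folder_old
cp -rT my_folder_old my_folder_new    # -T treats my_folder_new as the destination itself, so the contents are merged into it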
I was dealing with this same issue and it was driving me crazy. I tried cp -frp but it did not work, so before doing the cp, just remove the existing folder using rm. See below for more info about this:
Remove Directory Linux
If a directory or a file within the directory is write-protected, you will be prompted to confirm the deletion. To remove a directory without being prompted, use the -f option:
rm -rf dir1
I am trying to loop through all of the files in a directory and, for each one:
1. Move it to a workspace (I need to do this because the workspace doesn't have much storage).
2. Run a program which produces an output directory that contains all the files I will want to work with in the future.
3. Delete the original file from the workspace (to save space in the workspace).
4. Move the output directory out of the workspace and back to the storage space.
I am able to do this for each file singly (i.e. each line works if I substitute an actual file name), but I can't get the for loop to work. I am quite new to this, so I probably did something simple wrong.
Can anyone see where I am going wrong?
for i in path_to_files; do
#copy to home directory (from scratch)
cp $i .
#Run IDBA
idba_ud -l $i -o '$i'_out
#remove file from work directory (limited space)
rm $i
#copy out directory back to scratch
cp -r '$i'_out path_to_files
done
I keep getting an error that says
syntax error near unexpected token `cp'.
I have also tried replacing cp with copy and i/$i with file/$file with no luck.
If this is indeed a POSIX-compatible shell (your code looks like it, but you haven't specified which shell you're actually using), then:
You should always quote filenames, in case they contain spaces or other special characters.
But you should not use single quotes, as these prevent the shell from expanding your variables.
When appending text to a substituted variable, use ${} notation (e.g. if $i expands to "murgel", then ${i}foo will expand to "murgelfoo", whereas $ifoo will expand to "" (an empty string) if there is no variable named ifoo).
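A quick illustration of that last point:
i=murgel
echo "${i}foo"   # prints: murgelfoo
echo "$ifoo"     # prints an empty line, because the shell looks up a variable named 'ifoo'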
Thus try:
filepath=/path/to/files
for i in "${filepath}"/*; do
#copy to home directory (from scratch)
cp "${i}" .
#Run IDBA
idba_ud -l "${i}" -o "${i}_out"
#remove file from work directory (limited space)
rm "${i}"
#copy out directory back to scratch
cp "${i}_out"/* "${filepath}"/
done
When I copy something, I always forget the -R, and then I have to go all the way back to add it right after cp.
I want to add this to my bash config files.
alias cp="cp -R"
I have not seen anything bad happen. Is it safe to do this?
The only thing I can think of that would cause unexpected behavior with the -R flag is how it interacts with wildcards.
What I mean is: say you want to copy all the mp3 files in a directory and every subdirectory with cp -R /path/*.mp3. Although -R is given, it will not copy the mp3s in the subdirectories of /path, if there are any.
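That is because the shell, not cp, expands the wildcard before cp even runs, so cp only ever sees the matches from the top level of /path. A quick way to check this, and one possible workaround (hypothetical paths):
echo /path/*.mp3                                  # shows exactly the arguments cp would receive
find /path -name '*.mp3' -exec cp {} /dest/ \;    # copies the mp3s from subdirectories too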
I wouldn't use aliases for changing the behaviour of normal commands. When you are in a different shell / another computer, the alias will be missing. When a friend wants to help you, he will not know what you did.
Once I had an alias rm="rm -i", and I ran rm * just after switching shells with su, where my alias no longer applied.
And sometimes you want to use cp without the -R option; will you remember to use /bin/cp in those cases (e.g. to copy all files in the current dir to another location, but not the subdirs)?
So I'm currently trying to write a few small scripts that allow me to manage my iTunes library, of which I have clones on multiple OS X machines.
The basic idea is that I have a NAS holding a copy of the library that is used as an intermediate "master copy" since the machines holding the actually used copies aren't available all the time. If I want to update my old copy on machine B with the newer version from machine A, I'd then update the NAS copy based on machine A's current state, then update machine B from the updated NAS copy possibly at a later time.
The script files are located on the NAS within the same folder that also houses the iTunes directory. Since I'm mounting the NAS as a volume via AFP, I simply open a Finder window with the directory containing the scripts and drag'n'drop the script I want to use to a Terminal window for easy execution.
This is my attempt at the "update NAS from local copy" script:
rsync -avz --compress-level 1 --exclude 'Mobile Applications/*.ipa' --delete --delay-updates -n "$(echo $HOME | sed 's/ /\\ /g')/Music/iTunes" "$(dirname $0 | sed 's/ /\\ /g')"
(the -n option is of course only there for testing the script)
Since there will be spaces in the paths I supply rsync with, I already figured out that I'd need to escape those somehow. I also know that the standard way to do that on OS X is to prepend all the spaces with a backslash, at least when manually typing paths in Terminal. But the code above still won't work – rsync complains that it cannot change to the directory I supplied, although the path it spits out in the error message seems to be perfectly fine and can be cd'd to, if you remove the double quotes around it first:
[...]
building file list ... rsync: change_dir "/Volumes/Macintosh\ HD/Users/Julian/Music" failed: No such file or directory (2)
done
[...]
If I remove the surrounding double quotes in the script itself, rsync seems not to honor the escaping backslashes at all and still treats the space following the backslash as a separator:
[...]
building file list ... rsync: link_stat "/Volumes/Macintosh\" failed: No such file or directory (2)
rsync: change_dir "/Volumes/Macintosh HD/Users/Julian/HD/Users/Julian/Music" failed: No such file or directory (2)
done
[...]
And no, I can't work around the issue by shortening /Volumes/Macintosh\ HD/Users/Julian/Music to /Users/Julian/Music since this machine has multiple HDDs and / is not the same disk/partition as /Volumes/Macintosh\ HD. So I need to find a solution for this specific problem.
I'm seriously lost now.
Can anyone please explain to me what I need to change in order to have rsync recognize the paths correctly?
After messing around quite a bit more and finding this question, I managed to develop a working solution: skip the manual backslash-escaping entirely and just quote the variable expansions, since the backslashes that sed inserted were being passed to rsync as literal characters.
localpath=$HOME/Music/iTunes
remotepath=$(dirname "$0")/
rsync -avz \
--compress-level 1 \
--exclude 'Mobile Applications/*.ipa' \
--delete \
--delay-updates \
-n \
"$localpath" \
"$remotepath"