SCP command line input issue - shell

I created an SCP script that takes the source directory as a command line argument, but I see an issue with that. For example, if I want to copy 5 files under the /var/lib directory and I pass /var/lib/* on the command line, each file is taken as a separate command line argument and my script fails. I can't just copy the whole lib directory either, because I do not want that folder to be created at the destination. Any idea would help, thanks!
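One way around this, sketched below with a hypothetical wrapper name and a placeholder destination (neither is from the original script), is to let the script accept any number of sources via "$@" and hand them all to scp in a single call:
#!/usr/bin/env bash
# copyfiles.sh - hypothetical sketch; the destination is a placeholder, adjust as needed.
# "$@" keeps every file the shell expanded from /var/lib/* as its own argument.
dest="user@remotehost:/remote/dir/"
scp "$@" "$dest"
Called as ./copyfiles.sh /var/lib/*, the shell expands the glob and scp receives all the files plus the single destination, so no extra folder is created on the remote side.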

Related

Command not found in ssh but is in directory

I am using an ssh account that connects to an external server. I have downloaded some software like samtools and bedtools through guix, but when I try to use them in my directory it gives me this error:
-bash: samtools: command not found
In my directory, however, there is the directory guix.profile, and if I go into its bin folder, I have everything I downloaded.
What am I doing wrong?
Thank you
To run a file from the shell you need two things:
The shell must find the file
Being in the same directory does not enable the shell to find the file. You have to either supply an absolute or relative path to the file, or have the directory in your PATH environment variable.
In simplest terms this means instead of
$ samtools
try
$ ./samtools
The relative path ./ tells the shell exactly which file you mean.
To run it from another directory, either use the whole absolute path, e.g. /home/yourname/samtools, or move the file into a directory that is on your $PATH
The file needs to be executable
If the file is not executable you will need to run:
$ chmod +x ./samtools
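Since the question mentions a guix.profile directory with a bin folder inside it, the other common fix is to put that bin directory on PATH; the exact profile location below is an assumption, so adjust it to wherever guix.profile actually lives:
# Assumed profile location; adjust the path to match your account
export PATH="$HOME/guix.profile/bin:$PATH"
samtools --version
Adding the export line to ~/.bash_profile makes it take effect on every future login.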

Script for copying a folder somewhere using Git Bash

I want to copy a folder ~/Projects/LocalProject onto my server //VM-Server/ServerProject.
I know that I can use GitBash:
cp -r directory-name-1 directory-name-2
But what I'm curious about is: can I create a script to do that by double clicking it, or add it as a command to my GitBash? Because I will need that a lot.
--Edit--
Tried nothing, as I don't know how to do that. Yes, there are hidden files, and I don't want them to be copied. There shouldn't be newer files on the destination. I need to run it manually; I thought that was clear since I mentioned the option of having an executable script or a terminal command.
Option 1: Batch file
You don't even need to open git-bash: you can make a batch file in any text editor, name it copy to server.bat, and type in cp -r C:\Users\<Your username>\Projects\LocalProject \\VM-Server\ServerProject. This works as long as Git's Unix tools are on your Windows PATH; otherwise use the built-in xcopy instead of cp.
You can also make a .sh file for use in bash. The command is the same; just note that Windows uses \ while bash uses / as the directory separator.
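As a sketch of that .sh variant, using the paths from the question (the file name copy-to-server.sh is just a suggestion):
#!/usr/bin/env bash
# copy-to-server.sh - recursive copy of the local project to the server share
cp -r ~/Projects/LocalProject //VM-Server/ServerProject
After chmod +x copy-to-server.sh you can run it with ./copy-to-server.sh from Git Bash; double clicking works if .sh files are associated with Git Bash on your machine.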
Option 2: Alias
Open your ~/.bash_profile file (it lives in your home directory; create it there if it doesn't exist).
Add a line at the end of the file that says alias copyToServer='cp -r ~/Projects/LocalProject //VM-Server/ServerProject' (no spaces around the =; the -r is needed because a directory is being copied). Then close git-bash, reopen it, and use the command by typing copyToServer as a bash command. (It doesn't need to be named copyToServer.)
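If you prefer to do both steps from an already open Git Bash session, something like this works, assuming ~/.bash_profile is the file your setup reads:
# Append the alias and reload the profile in the current shell
echo "alias copyToServer='cp -r ~/Projects/LocalProject //VM-Server/ServerProject'" >> ~/.bash_profile
source ~/.bash_profile
copyToServer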

Executing a bash script from anywhere on Windows

I am on Windows.
I have a script file named basics.sh and here is what it contains:
cd opt-out-exam/abduvosid_malikov/IT
mkdir made_by_my_script
cd made_by_my_script
echo "Hello World" > hello.txt
so basically, the basics.sh script file is supposed to:
go to folder opt-out-exam/abduvosid_malikov/IT
make a directory made_by_my_script
create hello.txt file with content Hello World
Right now, to execute this basics.sh script, I am going to the IT folder and writing this command in the terminal:
./basics.sh
In order to execute this basics.sh script, is it compulsory for me to go to the IT folder
OR
is it possible to execute this script file even if I am staying in another folder (let's say the current working directory is opt-out-exam)?
The first line is a change directory command followed by a relative path, not an absolute one. In such cases, it matters where you run the script from. (An absolute path would start with the filesystem root, i.e. /.)
If you run this script from a directory (I wouldn't call it a folder in this context) where the relative path opt-out-exam/abduvosid_malikov/IT does not exist, it won't cd into it. But it will still make a new directory without any problem, and it will also create the file and write a line into it, just in whatever directory you happened to run the script from.
So only the first line will fail if it's run somewhere else.
UPD: As Gordon Davisson pointed out, this means that you want to check whether the directory change actually took place or not.
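A sketch of what that check could look like, assuming the exam directory sits under $HOME (the absolute path here is a guess, not something stated in the question):
#!/usr/bin/env bash
# basics.sh, made safe to run from anywhere:
# use an absolute path and stop if the cd fails
cd "$HOME/opt-out-exam/abduvosid_malikov/IT" || exit 1
mkdir -p made_by_my_script
cd made_by_my_script || exit 1
echo "Hello World" > hello.txt
With the absolute path in place, the script behaves the same no matter which working directory it is launched from.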

Taking a backup of files while using Copy & Paste via the command line

I am using the command cp -a <source>/* <destination> for copying and pasting files into one particular destination. In the destination, the above command only replaces the files inside a folder that is also present in the source. If there are other files present in the destination, the command will not touch them and leaves them as they are. Now, before doing the pasting, I want to take a backup of the files that are about to be replaced by the copy. Is there an option in the cp command that does this?
The standard cp command has no such option, so here you need to create a shell script. First execute an ls command in your destination directory and store the output in a file like history.txt. Now, just before the cp command, run a grep for the name of the file you want to copy against that history file, to check whether the file is already available in the destination. If it is (that is, its name appears in the history file), back up the file in the destination directory first with today's datestamp, and then copy the same file name from source to destination.
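A minimal sketch of that approach, with src and dest as placeholder paths and history.txt plus the .bak naming being assumptions rather than anything given in the question:
#!/usr/bin/env bash
# Back up any destination file that is about to be overwritten, then copy.
src="source"        # placeholder source directory
dest="destination"  # placeholder destination directory

ls "$dest" > history.txt

for f in "$src"/*; do
    name=$(basename "$f")
    # If the name already appears in the destination listing, keep a dated backup first
    if grep -qxF "$name" history.txt; then
        cp -a "$dest/$name" "$dest/$name.$(date +%Y%m%d).bak"
    fi
    cp -a "$f" "$dest/"
done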
If you want to back up the destination files that are about to be replaced, use the -b option, available in GNU cp:
cp -ab <source>/* <destination>
There are two caveats that you should know about.
To my knowledge, this option is not available on non-GNU systems (like BSD systems).
It will ask for confirmation for each existing file in the target. You can reduce the problem with the -u option, but this is unusable in a script.
Since it appears that you are trying to make a backup (copy files to another location, don't erase them, don't overwrite those already there), you probably want to take a look at the rsync command. The same operation would be written:
rsync -ab --suffix=".bak" <source>/ <destination>
and the rsync command is much more flexible for handling this sort of thing.

Create a bash script that runs and updates a log whenever a file is deleted

I am new to bash scripting and I have to create a script that will run on all computers within my group at work (so it's not just checking one computer). We have a spreadsheet that keeps certain file information, and I am working to automate the updating of that spreadsheet. I already have an existing python script that gathers the information needed and writes to the spreadsheet.
What I need is a bash script (cron job, maybe?) that is activated anytime a user deletes a file that matches a certain extension within the specified file path. The script should hold on to the file name before it is completely deleted. I don't need any other information besides the name.
Does anyone have any suggestions for where I should begin with this? I've searched a bit but not found anything useful yet.
It would be something like:
for folders and files in path:
    if file ends in .txt and is being deleted:
        save file name
To save the name of every .txt file deleted in some directory path, or any of its subdirectories, run:
inotifywait -m -e delete --format "%w%f" -r "path" 2>stderr.log | grep '\.txt$' >>logfile
Explanation:
-m tells inotifywait to keep running. The default is to exit after the first event
-e delete tells inotifywait to only report on file delete events.
--format "%w%f" tells inotifywait to print only the name of the deleted file
path is the target directory to watch.
-r tells inotifywait to monitor subdirectories of path recursively.
2>stderr.log tells the shell to save stderr output to a file named stderr.log. As long as things are working properly, you may ignore this file.
>>logfile tells the shell to redirect all output to the file logfile. If you leave this part off, output will be directed to stdout and you can watch in real time as files are deleted.
grep '\.txt$' limits the output to files with .txt extensions.
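If you also want to act on each deletion as it happens (for example, to feed the existing Python updater), the same pipeline can drive a read loop; the watched path and log file name below are placeholders:
#!/usr/bin/env bash
# Watch a directory tree and record each deleted .txt file with a timestamp.
watch_dir="/path/to/watched/dir"   # placeholder path

inotifywait -m -e delete --format '%w%f' -r "$watch_dir" 2>>stderr.log |
  grep --line-buffered '\.txt$' |
  while read -r file; do
      printf '%s %s\n' "$(date '+%F %T')" "$file" >> deleted_files.log
      # call your existing Python script here with "$file" if needed
  done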
Mac OSX
Similar programs are available for OSX. See "Is there a command like “watch” or “inotifywait” on the Mac?".
