I am trying to schedule a mysqldump through SSIS, and the only way I see possible is through the command line with PuTTY. My command looks something like this:
putty.exe user@host -pw password -m script.txt
Inside my script is my mysqldump command:
mysqldump -h host -u username -ppassword schema > dump_file.sql
When I test this with a dummy database, it works because the dump completes in under a minute: PuTTY runs the command, opens a shell, executes the script, and exits the shell when it is done.
My issue is that with a live database the dump takes longer, my connection eventually times out, and the shell never closes (even though the dump is successful). This leaves my SQL Server Agent job hanging, and it never ends.
Is there a way that I can run the script and then exit the shell without waiting for it to finish first?
Try running the mysqldump command in the background by appending &, and use nohup so it keeps running after the shell closes:
nohup mysqldump -h host -u username -ppassword schema > dump_file.sql &
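One caveat worth noting: the SSH session can still hang if the background job keeps its output streams attached to the terminal, so it is safer to redirect stderr and stdin as well. A minimal sketch of what script.txt might contain, using the placeholder names from the question:
#!/bin/sh
# Detach the dump and close all three streams so the shell can exit at once.
nohup mysqldump -h host -u username -ppassword schema > dump_file.sql 2> dump_file.err < /dev/null &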
What I am trying to do is log in to a remote Linux server over SSH from my Windows machine by running a shell script via Git Bash.
I would like to write a script which will be used by a user with basic IT knowledge. This script will execute a bunch of commands on the remote machine, so it does need to establish an SSH connection.
What I have tried to write in this script so far is:
ssh username@ip <password>
EDIT: You should consider installing Jenkins on the remote system.
Running a command on a remote system can be done via:
ssh user@host command arg1 arg2
If you omit the password, a password prompt will appear. To get around the prompt, you should consider setting up passwordless SSH login: https://askubuntu.com/questions/46930/how-can-i-set-up-password-less-ssh-login
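For reference, the usual key setup from the Git Bash side looks roughly like this (username and ip are the placeholders from the question; if ssh-copy-id is not available, append the public key to ~/.ssh/authorized_keys on the server by hand):
ssh-keygen -t ed25519      # generate a key pair, accepting the defaults
ssh-copy-id username@ip    # install the public key on the remote server
ssh username@ip            # now logs in without a password prompt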
Rephrasing what you said, you want to write a script (namely script1.sh) which does the following (in this order):
Start an SSH connection with a remote server
Execute some commands, possibly written in another script (namely script2.sh)
Close the connection / Keep it open for additional command line commands
If you want to put the remote commands in script2.sh instead of listing them in script1.sh, you need to consider that script2.sh must be on the remote server. If it is not, you may delegate to script1.sh the creation/copy of script2.sh on the remote machine, for example into a temporary folder.
In this case, my suggestion is to write script1.sh as follows (a full sketch follows these steps):
Copy script2.sh to the remote machine with
scp /path/to/local/script2.sh user@host:/path/to/remote/script2.sh
Execute script2.sh on the remote shell. Note that in a script the command has to be passed on the ssh line itself; a bare ssh user@host would just open an interactive shell and wait:
ssh user@host sh /path/to/remote/script2.sh
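Putting those two steps together, script1.sh could look roughly like this (paths, user, and host are placeholders; /tmp stands in for the temporary folder mentioned above):
#!/bin/bash
# Copy the remote-side script over, run it, then remove it.
scp /path/to/local/script2.sh user@host:/tmp/script2.sh
ssh user@host 'sh /tmp/script2.sh && rm /tmp/script2.sh'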
Instead, if you prefer to list everything in just one script file, feed the commands to ssh through a here-document. (Simply writing the commands on the lines after ssh user@host would not work: they would only run locally, after the remote session ends.)
ssh user@host <<'EOF'
echo "This command has been executed on the remote server"
echo "This command has also been executed on the remote server"
EOF
The connection closes on its own when the here-document ends, which also prevents the user from executing additional commands on the remote shell.
You may consider copying your SSH keys to the remote server to avoid password prompts, as described in the askubuntu link above.
I am trying to sync my Oracle production server to an Oracle DR server. There is a set of commands that I have to run each time to remove transport and apply lag, so first I have to log in through PuTTY.
To automate that, I created an executable
putty.sh
which reads
putty.exe -ssh servername -l oracle -pw password -m C:/automation/dgmgrl.sh
(dgmgrl.sh is the file where the script I want to run is saved)
The commands in the dgmgrl.sh file are:
#echo off
dgmgrl
"and so on"
I am getting the error bash: line 3: dgmgrl: command not found.
I am not getting this error when typing the command manually.
I am getting errors while running dgmgrl;
I am also getting errors while running commands like sqlplus /nolog,
but commands like ls and cd run fine.
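A likely cause, offered here as an assumption since the thread leaves the question open: a script passed through PuTTY's -m option runs in a non-interactive shell, which does not source the oracle user's login profile, so ORACLE_HOME and PATH are never set. Basic commands like ls and cd still resolve, but Oracle binaries such as dgmgrl and sqlplus cannot be found. A sketch of a dgmgrl.sh that sets the environment itself (the ORACLE_HOME path is a placeholder for your actual installation):
#!/bin/bash
# Non-interactive SSH sessions skip the login profile, so set the
# Oracle environment explicitly before calling dgmgrl or sqlplus.
export ORACLE_HOME=/u01/app/oracle/product/19.0.0/dbhome_1
export PATH=$ORACLE_HOME/bin:$PATH
dgmgrl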
I have a redis database on a remote docker host, and I'd like to access it through a single ssh script command via plink.
The script is simple (redis-script.sh):
#!/bin/bash
echo "Enter Redis Password."
read -s pass
docker exec -it my-redis-container redis-cli -a $pass
This works fine when I make a standard SSH connection via PuTTY and then run the script after login. I am able to enter the password and connect to the db:
Enter Redis Password.
Warning: Using a password with '-a' or '-u' option on the command line interface may not be safe.
127.0.0.1:6379>
The problem is when I use plink. My plink command line is:
plink.exe -t container-host /containers/redis-script.sh
I get this:
Enter Redis Password.
Warning: Using a password with '-a' or '-u' option on the command line interface may not be safe.
[6n
One issue is the mangled characters, but the biggest issue is that I can no longer type in any commands at this point. I am able to interact when it asks for the password, but once it gets into redis-cli, I cannot type anything.
Perhaps it's the docker exec command which is messing up the interactivity?
Any help is appreciated.
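One thing worth trying, offered as an assumption rather than a confirmed fix: the stray [6n is a terminal escape sequence, which suggests the remote side does not know what terminal type it is talking to, and plink does not always pass one along. Setting TERM explicitly for the remote command sometimes restores interactivity:
plink.exe -t container-host "TERM=xterm /containers/redis-script.sh"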
I have a couple of bash commands that I want to execute, and I was hoping to get some help writing a single script to run them, if possible.
On one console, I want to execute a command iproxy 2222 22. The console will then print waiting for connection.
I'll have to open another console to execute a command ssh -p2222 root@localhost. Once I remote into a phone, I want to execute a simple command like ls.
I'm getting stuck on opening a second console and executing the command.
Can anyone give me some hint?
Thanks
You have several things here:
First, to run both the iproxy and the ssh commands in the same script, without having to use two different consoles, you need to run one of them in the background. This is easily done by appending & to the end of a command. In the next example the iproxy command runs in the background while the ssh command runs in the foreground at the same time:
iproxy 2222 22 &
ssh -p2222 root@localhost
Then, to execute a command over the remote shell opened by the ssh command, you just need to include it as the last part of the ssh call. The next example will open an SSH connection to root@localhost on port 2222, execute an ls command in the remote shell, and finally close the SSH connection:
ssh -p2222 root@localhost ls
Finally, to launch a new terminal and execute a command (or a script) in it, you just need to call the type of terminal of your choice, using the -e option with the command (or the name of the script) to be executed. The next example will open a new gnome-terminal and will execute the previous ssh example:
gnome-terminal -e "ssh -p2222 root@localhost ls"
Alternatively you can open a new konsole or a new xterm (or any other kind of terminal you may have installed on your system).
You will notice that the terminal will close itself after executing the command. If you need or want it to remain open, then you will have to modify the call according to the type of terminal you have opened:
For xterm you will need to use the -hold option.
For konsole you will need to use the --noclose option.
For gnome-terminal this is a bit more tricky. The way to do it is to create a profile, modify the profile preferences to hold the terminal open when the command exits, and reference that profile when calling the terminal: gnome-terminal --window-with-profile=NAMEOFTHEPROFILE -e command.
Putting all together, your script should be more or less like this:
#!/bin/bash
# Start the USB-to-TCP tunnel in the background.
iproxy 2222 22 &
# Give iproxy a moment to start listening before connecting.
sleep 1
xterm -hold -e "ssh -p2222 root@localhost ls"
I am on a server and running a script file that has the following code.
ssh username@servername
sudo su - root
cd /abc/xyz
mkdir asdfg
I am able to ssh... but then the next command is not working: the script is not sudo-ing. Any idea?
Edit: I was able to create a mech ID and do these things that way, though I am still looking for the answer to the above question :|
First of all, your script will get stuck on the first line because it goes into interactive mode. The ssh command will require a password from the user (unless an SSH key is being used), and once ssh is logged into the remote server it will wait for user commands from standard input.
Secondly, the lines following the ssh command will be executed only when the ssh process has exited, and they run on your local machine, not the remote one. This is why your script is not sudo-ing: it is waiting for ssh to end.
So if your point is to run a command on a remote server, put the command on the same line as the ssh connection. In your case:
ssh user@server sudo su - root
But this alone will not satisfy your use case. I suggest you create a script of what you want to execute on the remote server, and then execute that script:
ssh user@server scriptName
The sudo part is tricky because, again, your script might get stuck in interactive mode waiting for a password to be entered, so I suggest you rethink the script with that in mind.
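As an illustration (a sketch under assumptions, not a drop-in solution: the directory name is taken from the question, and sudo must be allowed to prompt on a terminal), forcing a pseudo-terminal with -t lets sudo ask for its password interactively, and the whole operation can be chained in one remote command:
ssh -t username@servername 'sudo mkdir -p /abc/xyz/asdfg'
If passwordless sudo is configured for the user, the -t flag can be dropped.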
mb47!
You want to run the script on the remote computer, correct?
On the remote machine, create a file containing the commands you would like to execute.
Then, on the other machine, run ssh user@machine /path/to/script/you/created/earlier
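One common pitfall worth noting (an assumption, since the answer does not mention it): the remote file must be executable, or the interpreter must be named explicitly:
ssh user@machine 'chmod +x /path/to/script/you/created/earlier'
or
ssh user@machine 'bash /path/to/script/you/created/earlier'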
I hope this helps!
ALinuxLover