Command Line & and && - Windows

What I am currently attempting to do is: remote onto a different server, launch a scheduled task, exit, change to a specified folder on my desktop, and write a file.
My code currently looks like this:
C:\MyOriginalFolder> psexec \\MYREMOTESERVER -u MYUSERNAME cmd
C:\MYREMOTESERVER> SCHTASK.......
C:\MYREMOTESERVER> exit & cd C:\Users\ce132d & echo "Logged off" > MyLog.txt
// expected: the folder C:\Users\ce132d should have a text file called MyLog.txt
// what happens: I end up in C:\MyOriginalFolder with no MyLog.txt file created
When I remove the &'s and test it command by command, all is dandy and the expected behavior happens. But when I link them together with & and &&, the expected behavior does not happen.
So my question is this: is there some way of one-lining the actions of exiting, changing directory, and writing a text file?
I am eventually going to check whether logging into the remote server was successful or not, and want to put those three actions into an if (successful login) {do 3 tasks} else {write error log}.

You send your remote server exit & cd C:\Users\ce132d & echo "Logged off" > MyLog.txt.
Why do you expect the cd ... to have any effect on your local directory?
When you send the commands line by line, your exit first EXITs the remote server context; then you are back on your local machine, and therefore the rest of the commands work as expected.

The > MyLog.txt redirection is evaluated before any of the commands are run, including the cd. You can use > C:\Users\ce132d\MyLog.txt instead.
You can also use >> instead of > to append to the file.
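Putting both answers together, a minimal batch sketch of the eventual if/else version (MyTask is a hypothetical task name; cmd /c runs the remote command and returns without needing an interactive exit, and psexec passes back the remote exit code):

psexec \\MYREMOTESERVER -u MYUSERNAME cmd /c "SCHTASKS /Run /TN MyTask"
if %errorlevel% equ 0 (
    echo Logged off> C:\Users\ce132d\MyLog.txt
) else (
    echo Remote step failed> C:\Users\ce132d\ErrorLog.txt
)

Note the absolute paths in the redirections, which sidestep the working-directory problem entirely.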

Related

Capture Shell Script Output and Do a String Match to execute the next command?

I have a shell script doing the following two commands, connecting to a remote server and putting files via SFTP; let's call it "execute.sh":
sftp -b /usr/local/outbox/send.sh username@example.com
mv /usr/local/outbox/DD* /usr/local/outbox/completed/
Then in my "send.sh" I have the following commands to be executed on the remote server:
cd ExampleFolder/outbox
put Files_*
bye
Now my problem is:
If the first command (sftp -b) fails due to a remote connection error or some other network problem, the files are still moved into the "completed" folder, which is incorrect. I want the next command (mv) to be executed only if the first command (sftp) connects successfully.
Can we do this by enhancing this shell script, or with some workaround?
My shell is Bash.
Simply insert && between the two commands:
sftp -b /usr/local/outbox/send.sh username@example.com && \
mv /usr/local/outbox/DD* /usr/local/outbox/completed/
If the first fails, the second one will not run.
Alternatively, you can check the exit code of the first command explicitly. The exit code of the last command is always saved in $?, and it is 0 if the command succeeded:
sftp -b /usr/local/outbox/send.sh username@example.com
if [ $? -eq 0 ]
then
    mv /usr/local/outbox/DD* /usr/local/outbox/completed/
fi
If you really wanted to capture the output of the first command, you could run it in $(...) and store the value in a variable:
sftpOutput="$(sftp -b /usr/local/outbox/send.sh username@example.com)"
and then use this variable in further checks, e.g. match it against a pattern in the next if.
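For instance, a minimal sketch of such a check (the "Connection" pattern is an assumption; adjust it to whatever your sftp actually prints on failure):

sftpOutput="$(sftp -b /usr/local/outbox/send.sh username@example.com 2>&1)"
if [[ "$sftpOutput" == *"Connection"* ]]; then
    echo "sftp reported a connection problem" >&2   # leave the files in place
    exit 1
fi
mv /usr/local/outbox/DD* /usr/local/outbox/completed/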

Script file: command stopped

I made a simple script. The file name is sutest:
#!/bin/bash
cd ~/Downloads/redis-4.0.1/src
./redis-server
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
I ran the script: $ . sutest
But the script stops at ./redis-server, so I can't see the echo messages.
I want to make this kind of script file. How can I do that?
I would appreciate your help.
Let's take a more general case.
A myscript1 file executes a process like redis-server above.
Another myscript2 file executes a process like redis-server above.
Another myscript3 file executes a process like redis-server above.
How can I run the three script files simultaneously?
I want to do this over an ssh connection.
To make matters worse, what if I can't use screen or tmux?
Add an & at the end of the line:
./redis-server &
This runs the job in the background, so the script continues.
Just do the echos first:
cd ~/Downloads/redis-4.0.1/src
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
exec ./redis-server
The use of exec is a small trick (which you can omit if you prefer): it replaces the shell script with redis-server, so the shell script is no longer running at all. Without exec, you end up with the shell script waiting around for redis-server to finish, which is unnecessary if the script will do nothing further.
If you don't like that for some reason, you can keep the original order:
cd ~/Downloads/redis-4.0.1/src
./redis-server & # run in background
echo "uid is ${UID}"
echo "user is ${USER}"
echo "username is ${USERNAME}"
wait # optional
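For the more general case (three such scripts over one ssh session, with no screen or tmux available), one common approach is nohup together with &, so the jobs survive after you log out. A minimal sketch, assuming myscript1 through myscript3 are in the current directory:

nohup ./myscript1 >/tmp/myscript1.log 2>&1 &   # keeps running after logout
nohup ./myscript2 >/tmp/myscript2.log 2>&1 &
nohup ./myscript3 >/tmp/myscript3.log 2>&1 &

All three start immediately, and their output goes to the log files instead of the terminal.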

Why isn't this command returning to shell after &?

In Ubuntu 14.04, I created the following bash script:
flock -nx "$1" xdg-open "$1" &
The idea is to lock the file specified in $1 (flock), then open it in my usual editor (xdg-open), and finally return to the prompt, so I can open other files in sequence (&).
However, the & isn't working as expected. I need to press Enter to make the shell prompt appear again. In simpler constructs, such as
gedit test.txt &
it works as it should, returning the prompt immediately. I think it has to do with the existence of two commands in the first line. What am I doing wrong, please?
EDIT
The prompt is actually there, but it is somehow "hidden". If I issue the command
sudo ./edit error.php
it replies with
Warning: unknown mime-type for "error.php" -- using "application/octet-stream"
Error: no "view" mailcap rules found for type "application/octet-stream"
Opening "error.php" with Geany (application/x-php)
__
The errors above are not related to the question. But instead of __ I see nothing. I know the prompt is there because I can issue other commands, like ls, and they work. But the question remains: WHY is the prompt hidden? And how can I make it show normally?
Why isn't this command returning to shell after &?
It is.
You're running a command in the background. The shell prints a new prompt as soon as the command is launched, without waiting for it to finish.
According to your latest comment, the background command is printing some message to your screen. A simple example of the same thing:
$ echo hello &
$ hello
The cursor is left at the beginning of the line after the $ hello.
As far as the shell is concerned, it's printed a prompt and is waiting for a new command. It doesn't know or care that a background process has messed up your display.
One solution is to redirect the command's output somewhere other than your screen, either to a file or to /dev/null. If it's an error message, you'll probably have to redirect both stdout and stderr.
flock -nx "$1" xdg-open "$1" >/dev/null 2>&1 &
(This assumes you don't care about the content of the message.)
Another option, pointed out in a comment by alvits, is to sleep for a second or so after executing the command, so the message appears followed by the next shell prompt. The sleep command is executed in the foreground, delaying the printing of the next prompt. A simple example:
$ echo hello & sleep 1
hello
[1] + Done echo hello
$
or for your example:
flock -nx "$1" xdg-open "$1" & sleep 1
This assumes that the error message is printed within the first second. That's probably a valid assumption for your example, but it might not hold in general.
I don't think the command is doing what you think it does.
Have you tried running it twice, to see whether the lock cannot be obtained the second time?
Well, if you do, you will see that it doesn't fail, because xdg-open forks to exec the editor and returns immediately, so the lock is released almost at once. Also, if it did fail, you would expect some indication.
You should use something like this:
flock -nx "$1" -c "gedit '$1' &" || { echo "ERROR"; exit 1; }
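A quick way to observe the locking behavior for yourself (test.txt is just a stand-in file):

flock -nx test.txt -c "sleep 10" &                            # holds the lock for 10 seconds
flock -nx test.txt -c "echo got it" || echo "lock is busy"    # fails while the first command holds it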

Running a bash script via another bash script to delete a file is not working properly

I have a bash script start.sh which calls another script run.sh, which takes me to another prompt where I have to delete a file file.txt and then exit out of that prompt.
When I call run.sh from inside start.sh, I see the prompt, and I believe that it deletes file.txt, but the inner/new prompt waits for me to exit out of it while the script is running, meaning it needs intervention to proceed. How do I avoid that in bash?
In Python I can use Popen and get it going, but I am not sure about bash.
EDIT: I would rather like to know what command to provide to exit out of the shell (generated by running "run.sh") so I can go back to the prompt where start.sh was started.
Etan: To answer your question:
VirtualBox:~/Desktop/ > ./start
company@4d6z74d:~$ (this is the new shell)
company@4d6z74d:~$ logout (I pressed Ctrl-D here so the script could continue.)
Relevant part of start.sh:
/../../../../run.sh (this is the one that takes us to the new $ prompt)
echo "Delete file.txt "
rm -f abc/def/file.txt
You can run run.sh in the background using &. In start.sh, you would invoke the script via /path/run.sh &. Now, start.sh will exit without waiting for run.sh to finish (which is running in the background).
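A minimal sketch of start.sh with that change (the path to run.sh is the asker's placeholder):

#!/bin/bash
/../../../../run.sh &    # run.sh now runs in the background, so its prompt never blocks us
echo "Delete file.txt"
rm -f abc/def/file.txt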

Can't output result in bash from an ant command

I am writing a bash script that modifies some config files, runs "ant ear war" as a different user, outputs the result, and exits back to root to continue with the rest of the script. The issue is that the script does not continue after exiting, and I don't get any output from "ant ear war".
Thank you for the help.
Here is an example:
# When running the bash script I don't see the output. Maybe it's because I run it as root and switched to another_user. So I tried outputting the result into a variable and into a text file. Both failed.
su another_user
cd /usr/empi/MMEMPIV741/
echo $(ant ear war) >> /tmp/empi_install.txt
varant="$?"
echo "if zero it's a success, otherwise it's a failure"
cp /usr/accessmgr/AMV741/bin/am/JBoss/AccessManager.war /usr/jboss/jboss-eap-4.3/jboss-as/server/default/deploy/
cp /usr/empi/MMEMPIV741/person_project/working-dir/dist/* /usr/jboss/jboss-eap-4.3/jboss-as/server/default/deploy/
exit
# By this time the script should have exited from another_user and returned to root
echo $varant
echo "http://`hostname`:21080/PersonMasterIndexDQM/flex/login.jsp#"
Put the commands you want to run in a different user context into a separate script and run that script via
su another_user -c /path/to/other.sh
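A minimal sketch of that split (build.sh is an assumed name; the paths are the asker's):

# /path/to/build.sh -- everything here runs as another_user
cd /usr/empi/MMEMPIV741/
ant ear war >> /tmp/empi_install.txt 2>&1

# main script, run as root
su another_user -c /path/to/build.sh
varant="$?"   # exit status of build.sh; 0 means ant succeeded
echo $varant
cp /usr/accessmgr/AMV741/bin/am/JBoss/AccessManager.war /usr/jboss/jboss-eap-4.3/jboss-as/server/default/deploy/
cp /usr/empi/MMEMPIV741/person_project/working-dir/dist/* /usr/jboss/jboss-eap-4.3/jboss-as/server/default/deploy/

Since the su block is now a separate script, control returns to root automatically when it finishes, and no interactive exit is needed.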
