I just started learning shell scripting.
I'm trying to run a script from cron, without success. In a telnet session, when I run "sh ./script.sh", the output is as expected. What am I missing?
script.sh is in the /usr/bin folder.
#!/bin/sh
var1 = $(opkg update)
echo ${var1}
Simply try this:
#!/usr/bin/env bash
var1=$(opkg update)
echo $var1
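If the assignment is fixed but the script still only fails under cron, it helps to capture its output so the error becomes visible; a minimal crontab entry for that (the schedule and log path here are just examples) could be:
# run every 5 minutes and log stdout/stderr, so failures under cron can be seen
*/5 * * * * /usr/bin/script.sh >> /tmp/script.log 2>&1
Cron also runs with a much shorter PATH than your telnet session, so opkg may need its full path inside the script.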
I have a RHEL box (bash) and from it I have SSH'd to an ESXi host (sh).
Now, on the ESXi host, I have created a simple script:
#!/bin/sh
echo hello
exit
This only exits the script. I want to exit the script + exit the ESXi shell and return to my original RHEL bash.
Thanks much.
If you are only SSHing in for the purpose of running this command, then instead you could just have ssh run the command for you; when the script finishes, the ssh session ends and you are back at your RHEL prompt:
[RHEL]$ ssh user@ESXi '/tmp/myscript.sh'
...and if you need to interact with the script, or watch its output, add the -t switch:
[RHEL]$ ssh -t user@ESXi '/tmp/myscript.sh'
Remove the shebang, i.e. make the script just
echo hello && exit
save it as script, and then source it:
. script
Sourcing runs the commands in your current shell rather than in a child process, so the exit closes the ESXi login shell itself.
I am trying to write a watchdog for a Ruby application. So far, I have a cron job which is successfully calling a shell script:
#!/bin/sh
if ps -ef | grep -v grep | grep adpc.rb ; then
    exit 0
else
    NOW=$(date +"%m-%d-%Y"+"%T" )
    echo "$NOW - CRITIC: ADPC service is down! Trying to initialize..." >> che.log
    cd lib
    nohup ruby adpc.rb &
    exit 0
fi
This code runs correctly from the command line, but I am not able to make the shell script execute the Ruby script when it is called from a cron job.
Any help would be appreciated.
The Ruby file has +x permissions.
The nohup.out file is empty.
Solution: replace the bare "ruby" command with its full path (the output of which ruby).
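Concretely, that means the nohup line in the script becomes something like this (a sketch assuming which ruby prints /usr/local/bin/ruby; use whatever it prints on your machine):
# use the interpreter's absolute path, since cron's PATH is much shorter than a login shell's
nohup /usr/local/bin/ruby adpc.rb &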
Thanks to all for the replies =)
This is usually caused by an incorrect environment. Check the Ruby output in the created nohup.out file and log the stderr of nohup itself to a file.
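For example, to capture both streams somewhere you can read them later (the file names here are just placeholders):
# keep the app's stdout and any startup errors where you can inspect them
nohup ruby adpc.rb > /tmp/adpc.out 2> /tmp/adpc.err &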
It's frequently solved by starting the script with:
#!/bin/bash
source ~/.bash_profile
source ~/.bashrc
This will ensure that you run with bash instead of sh, and that any settings like PATH you've configured in your init files will be set.
I have this .sh script:
#!/bin/bash
CLASSPATH=$1
PACKAGE=$2
INPUT_FILE=$3
javac -classpath $CLASSPATH classes/$PACKAGE/$INPUT_FILE.java
cp classes/$PACKAGE/$INPUT_FILE.class ../WEB-INF/classes/$PACKAGE/
eval "$CATALINA"
And $CATALINA is set in .bashrc:
CATALINA_PATH="/var/local/tomcat/bin/catalina.sh"
CATALINA="sh $CATALINA_PATH stop && sh $CATALINA_PATH run"
But when I execute my .sh script it doesn't execute the command inside $CATALINA.
Am I doing something wrong?
Thanks!
Since I'm running a shell script, it runs in a sub-shell, and the sub-shell does not see the parent shell's unexported variables (like $CATALINA set in .bashrc).
Problem solved by running the shell script in the current shell instead:
. ./script.sh
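The difference is easy to see with a quick test (FOO is just an example variable name):
FOO=hello                               # plain shell variable, not exported
bash -c 'echo "child sees: [$FOO]"'     # child shell prints empty brackets
export FOO                              # now part of the environment
bash -c 'echo "child sees: [$FOO]"'     # child shell prints [hello]
So exporting CATALINA in .bashrc should also make it visible to the child script, without having to source the script.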
My default shell is bash. I have set some environment variables in my .bashrc file.
I installed a program which uses a .cshrc file. It contains the paths to several csh scripts.
When I run the following commands in the shell window, it works perfectly:
exec csh
source .cshrc
exec bash
I have tried to put these commands in a bash script; unfortunately, it didn't work.
Is there another way to write a script so that I get the same result as running the commands from a shell window?
I hope my question is now clear.
Many thanks for any help.
WARNING: don't put the following script in your .bashrc; it will reload bash and so reload .bashrc again and again (stoppable with C-c anyway).
Preferably use this script in your kit/CDS startup script (Cadence, presumably).
WARNING 2: if anything in your file2source fails, the whole 'trick' stops.
Call this script cshWrapper.csh:
#! /bin/csh
# to launch using
# exec cshWrapper.csh file2source.sh
source $1
exec $SHELL -i
and launch it using
exec ./cshWrapper.csh file2source.sh
it will: launch csh, source your file, and come back to the same parent bash shell
Example :
$> ps
PID TTY TIME CMD
7065 pts/0 00:00:02 bash
$>exec ./cshWrapper.csh toggle.csh
file sourced
1
$> echo $$
7065
where in my case I use the file toggle.csh:
#! /bin/csh
# source ./toggle.csh
if (! $?TOGGLE) then
    setenv TOGGLE 0
endif
if ($?TOGGLE) then
    echo 'file sourced'
    if ($TOGGLE == 0) then
        setenv TOGGLE 1
    else
        setenv TOGGLE 0
    endif
endif
echo $TOGGLE
Hope it helps
New proposal, since I faced another problem with exec.
exec kills whatever remains in the script, unless you force a fork by putting a pipe after it (exec script | cat). In that case, though, any environment variables set by the sourced script are not propagated back to the calling script, which is not what we want. The only solution I found is to use 3 files (for the example, let's call them main.bash, which calls first.cshrc and second.sh).
#! /bin/bash
#_main.bash_
exec /bin/csh -c "source /path_to_file/cshrc; exec /bin/bash -i -c /path_to_file/second.sh"
# after exec nothing remains (like Attila the Hun)
# the rest of the script is in 'second.sh'
In that manner I can, with a single script call, source an old cshrc design kit, still run some bash commands afterwards, and finally launch the main program in bash (say, Virtuoso).
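For completeness, second.sh is just an ordinary bash script holding whatever is supposed to run after the environment setup; a minimal sketch (the virtuoso line is only a placeholder for your real main program):
#! /bin/bash
#_second.sh_
# this runs in a bash that inherited the environment set up by the sourced cshrc
echo "PATH after csh setup: $PATH"
virtuoso &   # placeholder: launch the real main program here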
Is it possible to source a .bashrc file from .cshrc in a non-interactive session?
I'm asking because tcsh is our default shell at work and the .cshrc has to be used to set up the environment initially.
However, I am not really familiar with tcsh and I have my own set-up in bash, so right now I have the following lines at the end of my .cshrc file:
if ( $?prompt && -x /bin/bash) then
exec /bin/bash
endif
This works fine, loading my environment from .bashrc and giving me a bash prompt for interactive sessions, but now I also need the same set-up for non-interactive sessions, e.g. to run a command remotely via SSH with all the correct PATHs etc.
I can't use 'exec' in that case, but I can't figure out how to switch to bash and load the bash config files "non-interactively".
All our machines share the same home directory, so any changes to my local *rc files will affect the remote machines as well.
Any ideas welcome - thank you for your help!
After some more research I'm now quite sure that this won't work, but of course feel free to prove me wrong!
To load the environment in bash I have to switch to a bash shell. Even if that were possible "in the background", i.e. without getting a prompt, it would still break any tcsh commands, which would then be executed under bash instead.
Hmmmm, back to the drawing board...
If $command is set, there are arguments to csh (i.e. the shell was invoked with -c), so it is a remote shell command. This works for me in .cshrc:
if ($?command) then
    echo Executing non-interactive command in bash: $command $*
    exec /bin/bash -c "${command} $*"
endif
echo Interactive bash shell
exec bash -l
Test:
$ ssh remotehost set | grep BASH
BASH=/bin/bash
...
proves that it ran in Bash.