Keeping a terminal window open after running script from crontab - shell

I have this script
#!/bin/sh
curl -4 http://wttr.in/Colorado\ Springs
that I want to automatically execute each morning. I have my crontab entry as
* 7 * * * (path to script)
But either the script doesn't run, or it runs and then immediately closes the shell. I know that my cron jobs are running, as I have other scripts for backups that run on an hourly basis, but I can't figure out what detail I am missing here. I found one suggestion to include $SHELL in the script, but that made no difference. Any suggestions?

Usually, when I need to keep the terminal open, I run exec bash as my last command. I do that when I write an installer script that opens a terminal, does its job, and then vanishes; if there is an error, I want the terminal to stay open so that I can read it.
exec replaces the current program with the command we provide as its argument.
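For example, a minimal sketch of that pattern (the installer step is a placeholder):
#!/bin/sh
echo "doing the install work..."   # the real job goes here
exec bash                          # replace this script with an interactive shell so the window stays open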

Actually, I don't know what you are trying to achieve with this call in your crontab. Do you want to see the weather report on your terminal? Save the weather report to a file? Receive it by email?
If you do no redirections, you'll get the report in your mail.
If you want to have it in a file, just do:
curl wttr.in/Colorado+Springs > file
If you want to have it on your terminals, do
curl wttr.in/Colorado+Springs | wall
Please note that you don't need -4 or http://, and you can replace the escaped space (\ ) with +.
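Putting it together for the original goal, a crontab entry that saves the morning report to a file (the path and the 7:00 schedule are only examples) could be:
0 7 * * * curl -s wttr.in/Colorado+Springs > /home/you/weather.txt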
(DISCLAIMER: I'm the author of wttr.in)

Related

Calling another script to run from the current script

I'm writing a shell script that creates a file from input received from the user. Now I want to add a "view a file" feature to the current script. It would be unreasonable to retype everything, since I already have a script that does this.
I know it sounds crazy when it's possible to do this with a normal shell command, but I'm actually writing a script that helps me create pages generated with the touch command (these pages have a date, author name, subject, and title attached).
The question is: how do I call, or include, another script from this one?
There are a couple of ways to do this; my preferred way is using source. You can:
Call your other script with the source command (its alias is .), like this: source /path/to/script.
Make the other script executable, add the #!/bin/bash line at the top, and add the directory containing the file to the $PATH environment variable. Then you can call it as a normal command.
Use the bash command to execute it: /bin/bash /path/to/script (see the sketch below for how this differs from source).
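A minimal sketch of the difference (file names are hypothetical): source runs the other script in the current shell, so anything it sets persists, while /bin/bash runs it in a child process, so nothing it sets survives:
# helper.sh (hypothetical) -- just sets a variable
GREETING="hello from helper"

#!/bin/bash
# caller.sh (hypothetical)
source ./helper.sh       # runs helper.sh in this shell; GREETING is visible afterwards
echo "$GREETING"         # prints: hello from helper
/bin/bash ./helper.sh    # runs helper.sh in a child process; anything it sets is lost on exit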

How to format the cron entry when driving a shell script

I have a shell program named myshellscript.sh and am not having any luck getting it to run from cron. Can anyone see what I am missing? It runs perfectly from the shell terminal, but I just don't have the URL right yet to fire it off.
php /home/myuser/public_html/usr/local/cpanel/scripts/myshellscript.sh
Here is my latest attempt, which does not work. I have used wget rather than php before, with a full URL, but I have no idea whether I am on the right track or not.
Without a specific error message beyond "it does not work", we can only speculate on common issues that prevent cron jobs from running, e.g. user and file permissions, or missing path/environment info.
Check the man page for cron; note the information about setting the path and environment.
Some operating systems support setting environment options directly in your crontab file; others may require using full paths to executables, or perhaps allow you to refine the PATH variable in your script...
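For example, Vixie-style crons let you set variables at the top of the crontab itself (whether yours supports this is an assumption; check its man page):
SHELL=/bin/bash
PATH=/usr/local/bin:/usr/bin:/bin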
Try capturing/logging cron errors, e.g.:
*/5 * * * * /some_path/your_script.sh >> /tmp/cron.script.log 2>&1
The above line will send both stdout and stderr to the file shown (note that the file redirection must come before 2>&1, or stderr will escape to cron's mailer), and it will run every 5 minutes.
:)
Dale

Problems running bash script from incron

I have a simple incron task set up to run a command whenever a particular .json file is written to, then closed.
/var/www/html/api/private/resources/myfile.json IN_CLOSE_WRITE,IN_NO_LOOP /var/www/html/api/private/resources/run_service.sh
I can see that whenever the file is written to, there is a syslog entry for the event and the command that was triggered, along the lines of <date> - incrond: CMD (/var/www/html/api/private/resources/run_service.sh).
But nothing seems to happen...
Initially I thought this was caused by an issue with the script, but replacing the script's command with something simple such as echo "hello world" > /tmp/mylog.log still yields no output or results. I seem to have hit a brick wall with this one!
Update
Changing the incron command to /bin/bash /var/www/html/api/private/resources/run_service.sh now triggers the script correctly, as I can now get output from it.
A simple mistake on my part: despite all the examples online showing that using the script alone as the command should run it, for me it only works if I explicitly call bash to execute it:
<my directory/file to watch> <trigger condition> /bin/bash /var/www/html/api/private/resources/run_service.sh

How to open a command-line application with a command without closing it?

I am trying to write a script that opens a command-line application (sagemath in this case) and on startup sends a certain command down the pipe (to attach a script), without closing the application at the end.
I tried something like:
#!/bin/bash
echo "load(\"script.sage\")" | sage
This, of course, opens sage, loads the script, prints the script's output, and then closes sage. Adding & at the end of the last line didn't work.
I know that technically I could add this script to the list of scripts that are always loaded on startup, but this is not what I want. I thought it might be done by making a symbolic link in some directory to my script, but I'm not sure whether such a directory exists and where it would be.
Any suggestions?
Edit:
I didn't know about Expect (I'm a youngster in Linux). After reading about it a bit, following Mark's suggestion, I managed to solve this. If this is of any interest to anyone in the future, this does the trick:
#!/usr/bin/expect
set timeout 20
spawn sage
expect "sage:"
send "load(\"script.sage\")\n"
interact
You could use screen, depending on how dynamically you need this script to run. See http://linux.die.net/man/1/screen for information on how to use screen.
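A minimal sketch (assuming the Expect script above is saved as run_sage.exp and made executable; the name is hypothetical): start it in a detached screen session and reattach whenever you want to interact with sage:
screen -dmS sage /path/to/run_sage.exp   # start a detached session named "sage"
screen -r sage                           # reattach to it later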
You can either:
Use nohup to start the program, e.g. echo "load(\"script.sage\")" | nohup sage.
Or use the disown command.

How to test things in crontab

This keeps happening to me all the time:
1) I write a script (Ruby, shell, etc.).
2) I run it; it works.
3) I put it in crontab so it runs a few minutes later, so I know it runs from there.
4) It doesn't, with no error trace, so it's back to step 2 or 3 a thousand times.
When my Ruby script fails in crontab, I can't really tell why, because when I pipe the output like this:
ruby script.rb >& /path/to/output
I sort of get the output of the script, but I don't get any of its errors, and I don't get the errors coming from bash (like if ruby is not found or the file isn't there).
I have no idea what environment variables are set and whether or not that's the problem. It turns out that to run a Ruby script from crontab you have to export a ton of environment variables.
Is there a way for me to just have crontab run a script as if I ran it myself from my terminal?
When debugging, I have to reset the timer and go back to waiting. Very time consuming.
How can I test things in crontab better, or avoid these problems?
"Is there a way for me to just have crontab run a script as if I ran it myself from my terminal?"
Yes:
bash -li -c /path/to/script
From the man page:
[vindaloo:pgl]:~/p/test $ man bash | grep -A2 -m1 -- -i
-i If the -i option is present, the shell is interactive.
-l Make bash act as if it had been invoked as a login shell (see
INVOCATION below).
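In a crontab, that might look like this (the schedule is illustrative; note that an interactive shell with no terminal attached may print job-control warnings):
0 7 * * * bash -li -c /path/to/script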
G'day,
One of the basic problems with cron is that it gives you a minimal environment. In fact, you only get four environment variables set, and they are:
SHELL - set to /bin/sh
LOGNAME - set to your userid as found in /etc/passwd
HOME - set to your home dir. as found in /etc/passwd
PATH - set to "/usr/bin:/bin"
That's it.
However, what you can do is take a snapshot of the environment you want and save that to a file.
Now make your cronjob source a trivial shell script that sources this env. file and then executes your Ruby script.
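A minimal sketch of that approach (file names are hypothetical). First capture the environment once from an interactive shell; the sed turns each VAR=value line into an export statement (values containing quotes or newlines would need more care):
env | sed 's/^\([A-Za-z_][A-Za-z0-9_]*\)=\(.*\)$/export \1="\2"/' > "$HOME/.cron_env"

Then point the cronjob at a trivial wrapper that restores it:
#!/bin/sh
# run_ruby_job.sh: source the saved environment, then run the real script
. "$HOME/.cron_env"
exec ruby /path/to/script.rb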
BTW, having a wrapper source a common env. file is an excellent way to enforce a consistent environment for multiple cronjobs. It also enforces the DRY principle, because it gives you just one place to update things as required, instead of having to hunt through a bunch of scripts for a specific string if, say, a logging location is changed or a different utility is now being used, e.g. gnutar instead of vanilla tar.
Actually, this technique is used very successfully with The Build Monkey which is used to implement Continuous Integration for a major software project that is common to several major world airlines. 3,500kSLOC being checked out and built several times a day and over 8,000 regression tests run once a day.
HTH
'Avahappy,
Run a set command from inside the Ruby script, fire it from crontab, and you'll see exactly what's set and what's not.
To find out the environment in which cron runs jobs, add this cron job:
{ echo "\nenv\n" && env | sort ; echo "\nset\n" && set; } | /usr/bin/mailx -s 'my env' you@example.com
Or send the output to a file instead of email.
You could write a wrapper script, called for example rbcron, which looks something like:
#!/bin/bash
RUBY=ruby
export VAR1=foo
export VAR2=bar
export VAR3=baz
$RUBY "$*" 2>&1
This will redirect Ruby's standard error to standard output. Then you run rbcron in your cron job, and standard output contains both out and err from Ruby, as well as the bash errors arising from rbcron itself. In your cron entry, redirect with > /path/to/output to send the combined output and error messages to /path/to/output.
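A matching crontab entry (the paths and schedule are illustrative) could be:
*/10 * * * * /home/you/bin/rbcron /home/you/script.rb > /path/to/output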
If you really want to run it as yourself, you may want to invoke ruby from a shell script that sources your .profile/.bashrc etc. That way it'll pull in your environment.
However, the downside is that it's not isolated from your environment, and if you change that, you may find your cron jobs suddenly stop working.
