Insert output of schemer to Terminator config - bash

My problem is mainly ignorance of the various Linux commands; I've run into a wall because I don't know the right terms to look up the commands I need.
However, I'd like to take the string that schemer (github.com/thefryscorer/schemer) outputs and insert it into my Terminator config at the line which starts with "palette=", replacing the existing info.
The purpose is to set this to run at intervals so that my bash colors stay in sync with my cycling wallpapers.
If you could point me towards a place to learn of such automation and usage of commands, I'd be grateful.

Running Cinnamon 2.6.13 on Arch Linux, I wrote this code that takes a random file from a defined directory and applies it as the wallpaper. Afterwards it runs schemer and copies the newly generated config into Terminator's config directory.
#!/bin/bash
#Todo;
currentmonth=August2015
#directory of wallpapers for the current month
currentfilename="$(ls ~/Pictures/wallpapers/"$currentmonth" | shuf -n 1)"
#set $currentfilename to a random filename from $currentmonth
gsettings set org.cinnamon.desktop.background picture-uri "file:///home/cogitantium/Pictures/wallpapers/$currentmonth/$currentfilename"
#set wallpaper to the chosen file
~/Go/bin/schemer -term="terminator" ~/Pictures/wallpapers/"$currentmonth"/"$currentfilename" > ~/Scripts/temp/currentpalette
#generate palette and redirect output to temporary palette file
currentpalette="$(cat ~/Scripts/temp/currentpalette)"
#set $currentpalette to the generated palette line
echo "$currentmonth - $currentfilename - $currentpalette"
cp ~/.config/terminator/config ~/Scripts/temp/config
#work on a copy of the existing Terminator config rather than an empty file
sed -i "s/^.*\bpalette\b.*$/$currentpalette/g" ~/Scripts/temp/config
#insert generated palette into terminator config
cp ~/Scripts/temp/config ~/.config/terminator/config
It does contain some errors and behaves irregularly at times. Furthermore, Terminator doesn't seem to react to the changes, even after a killall. I'll update my answer, should I find a solution.

The two things you'll need are sed and cron.
sed is a stream editor. In particular, it uses regular expressions to allow you to search through text files and replace parts of your text.
For example, if you have
#conf.config
palette=REPLACE_ME
other.var=numbers
You can say sed -i 's/REPLACE_ME/replaced_text/g' conf.config
and sed will replace every occurrence of REPLACE_ME in conf.config with the replacement text (here, replaced_text). Without -i, sed writes the edited text to standard output instead of changing the file.
So that'll be in a bash script you write.
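Adapted to your case, a rough (untested) sketch might look like the following, assuming schemer prints a complete palette=... line, that your config lives at ~/.config/terminator/config, and with $wallpaper standing in for the chosen image path:
#!/bin/bash
#sketch: replace the existing palette= line with the one schemer prints
newpalette="$(~/Go/bin/schemer -term="terminator" "$wallpaper")"
sed -i "s|^palette=.*|$newpalette|" ~/.config/terminator/config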
What you'll want to do then is regularly execute your script, which you do by setting up a cron job.
You can either put your shell script in one of the /etc/cron.* folders (/etc/cron.hourly, /etc/cron.daily, etc.), or
enter crontab -e at the terminal to open your personal cron configuration file, for more fine-grained control over when your commands execute.
The format of a cron command is as follows:
minute hour day-of-month month day-of-week command
So you could execute a command (as is explained in the sources below) once a week on Monday afternoon with
30 17 * * 1 /path/to/command (30 = minutes past the hour, 17 = 5 pm, and 1 is Monday, since days of the week are numbered from Sunday = 0).
or every 15 minutes with
*/15 * * * * /path/to/command
And your command would be ~/scripts/myscript.sh or whatever bash script has your sed command.
So you could have your cron job run schemer, then in the same script, put that string in your config file. You'll probably need to reload your config file, and the command for that (to stick at the end of your script) is source [path_to_config]
And there you go. Run schemer regularly, grab the string, stick it in your config file, and reload the file. cron, some basic bash (as I am unfamiliar with schemer, or the nature of its output beyond "it's a string"), sed, and source.
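Putting it together, a single crontab line could look like this (the half-hour interval and the ~/scripts/myscript.sh path are just examples):
*/30 * * * * ~/scripts/myscript.sh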
Sources for cron jobs, as I am less familiar with them than with sed:
https://help.ubuntu.com/community/CronHowto
https://askubuntu.com/questions/2368/how-do-i-set-up-a-cron-job

Related

Setting the current date into a variable in a Script in bash

So for the life of me I cannot figure out why my script will not take my date command as a variable. I have a script that is run every time a message is received and procmail filters specific messages by their subject line. The script looks like this:
d=$(date +%Y%m%d)
:0 wc
* ^(From|subject).*xxx
| cat&>/xx/xx/xx/xx/MSG:$d && \
chmod xxx /xx/xx/xx/xx/MSG:$d && \
/xx/xx/xx/otherscript.sh /xx/xx/xx/xx/MSG:$d
I have run the date command plenty of times in other scripts and to stdout without any issue, so I am wondering if this is a procmail issue? I have looked at a few different sites about this but still have not found a solution. My end goal is to create unique file names as well as for organization purposes each time a new email comes in.
The other reason for me believing it has something to do with procmail is that it was working fine just 3 months ago (didn't change any files or permissions). I have even tried several variations (only showing a few examples):
$'date +"%Y%m%d"'
$(date)
echo $(date)
I get a variety of files created, with names like MSG:(date) and MSG:(date, etc. It looks as though it tries to read the variable but gets cut off, or the space between date and + is causing an issue.
And at the end of my script I send it to another script which also creates a new file with the date appended and it works just fine:
fileOut="/xxx/xxx/xxx/xxx.$v.$(date +"%Y%m%d-%H%M%S").xxx"
prints: xxx.YH8AcFV9.20160628-090506.txt
Thanks for your time :-)
Procmail does not support the modern POSIX shell command substitution syntax; you need to use backticks.
d=`date +%Y%m%d` # or just date +%F
If you want to avoid invoking an external process, the From_ pseudo-header contains a fresh date stamp on many architectures.
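Applied to the recipe above, only the assignment line needs to change (the redacted paths are left exactly as they were):
d=`date +%Y%m%d`
:0 wc
* ^(From|subject).*xxx
| cat&>/xx/xx/xx/xx/MSG:$d && \
chmod xxx /xx/xx/xx/xx/MSG:$d && \
/xx/xx/xx/otherscript.sh /xx/xx/xx/xx/MSG:$d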

Get sed to ignore special characters in file?

I'm trying to extract user's crontabs so as to view them together. The initial problem I ran into was that the crontab file (From crontab -l) contains a lot of commented lines placed there by the system to explain the file's function. I stole a sed snippet to deal with this that deletes lines starting with comments and replaces the rest of lines following comments with blanks. (As best as I understand it.)
Here's an example crontab I'd like to capture:
0 0 5 * * /home/thornegarvin/myscript.sh
The sed code I'm using is: (With croneditor.temp containing the crontab)
sed '/^[[:blank:]]*#/d;s/#.*//' croneditor.temp
I think that the command is matching the *s in the file as comments and then deleting the line, but I'm not sure if that's why the command is failing.
I need a version of this command or another one entirely that works as I intended (Grabbing crontabs from the output of crontab -l).
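For reference, this is what those two sed expressions do on a small sample (the comment lines are just typical crontab boilerplate): the address /^[[:blank:]]*#/d deletes whole comment lines, and s/#.*// strips anything from a # onward on the remaining lines; the asterisks in the schedule fields are left alone, since only the # character starts a comment.
$ cat croneditor.temp
# Edit this file to introduce tasks to be run by cron.
# m h  dom mon dow   command
0 0 5 * * /home/thornegarvin/myscript.sh
$ sed '/^[[:blank:]]*#/d;s/#.*//' croneditor.temp
0 0 5 * * /home/thornegarvin/myscript.sh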

Execute a bash script the first time shell opens

Is there a way to execute a bash script automatically on a daily basis? That is, I want the bash script to be executed every day, the first time I open a shell terminal.
Thanks!
Bash has two files, from the user perspective, that perform "setup" when it is launched:
.bash_profile - This file is executed whenever you open an interactive login shell. This file may also be named .profile in certain distributions or configurations. .profile is usually used for non-Bash specific configuration items. Also be aware that if you have the little used .bash_login, .bash_profile will prevent that file from being used, though it is otherwise equivalent. .bash_profile is standard.
.bashrc - This file is executed for all other bash instances. Note that it is common for people to call .bashrc from .bash_profile to create consistency.
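The usual way to do that is a couple of lines in .bash_profile (a minimal sketch of the common idiom):
# in ~/.bash_profile: pull in ~/.bashrc so login and non-login shells behave the same
if [ -f ~/.bashrc ]; then
    . ~/.bashrc
fi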
A login shell is spawned when you log in: via ssh, telnet, at a console, etc. You can also force the launch of a login shell (forcing .bash_profile to be processed) by starting a shell under su like so:
su - username
Here, the dash indicates that this should be processed as a login shell.
Neither of these seems to be the correct answer to your question, however, unless you are certain to log in once each day and only once each day.
A better approach in your case would be to use cron. Crontab allows you to schedule jobs to run at any desired interval. For daily execution, you would likely want a line configured like so:
0 5 * * * /home/user/script
This would cause the user's script to execute at 5am every day. The columns are:
0 5 * * *
^ ^ ^ ^ ^------ Day of week
^ ^ ^ ^-------- Month of year
^ ^ ^---------- Day of month
^ ^------------ Hour of day
^-------------- Minute of hour
Each of those fields can also take a comma-separated list, a range, or a step value. For example, the following will execute the script every four minutes during the 5 AM hour:
*/4 5 * * *
If you want the script to run only when you open the shell terminal, add it to your ~/.bashrc, /etc/bash.bashrc or /etc/bashrc file. This will execute any time an interactive non-login shell is started.
If you want it to execute daily, create a cron for it in /etc/crontab or crontab -e
While informative, the provided answers don't actually solve the original requirement.
The request is for a script to be run once a day, at the first login, and not again the rest of the day, but then again upon the first login the next day, and the next, etc...
To achieve this you can place the script you want to execute in ~/bin or whatever location you want it in.
At the bottom of your script.sh add these three lines which will remove the execution of the script upon subsequent logins.
grep -v script.sh ~/.bash_profile > ~/bash_profile.tmp
rm -f ~/.bash_profile
mv ~/bash_profile.tmp ~/.bash_profile
What these three lines do:
Reads in your .bash_profile and writes everything EXCEPT the line that contains script.sh to a tmp file
Deletes the existing bash profile that contains the execution of your script
Renames the tmp file, without the script.sh line, to be your new .bash_profile on subsequent logins.
THEN, use crontab -e to add a line to the crontab that will put that line back into your .bash_profile every morning, at a time after your last login of one day but before your first login of the next.
This example is set for zero minutes + four hours, or 4:00am.
0 4 * * * echo "~/bin/script.sh" >> ~/.bash_profile
A problem exists with this, however.
If, for example, the user only logs into the system Monday through Friday and not on Saturday or Sunday, the crontab will still add the line on Saturday and Sunday mornings, meaning that come Monday morning there will be three identical lines. This will cause the script to run three times.
To mitigate this, an IF statement is wrapped around it to check if the command already exists in the file before adding it.
0 4 * * * if [ "$(grep -c '~/bin/script.sh' ~/.bash_profile)" -eq 0 ]; then echo "~/bin/script.sh" >> ~/.bash_profile ; fi
The end result is:
Your script will execute upon first login of the day
At the end, your script will remove the trigger for it to execute on login
Your script will not execute on subsequent logins
Every morning, crontab will add the trigger back to your .bash_profile for it to be executed on the next login, which would be the first login that day.
But crontab will only add the trigger if it doesn't already exist.
note: This is likely not the most efficient or elegant solution, but it does work, and is pretty simple to understand.

Bash script to edit a bunch of files

To process a bunch of data and get it ready to be inserted into our database, we generate a bunch of shell scripts. Each of them has about 15 lines, one for each table that the data is going into. On a recent import batch, some of the import files failed going into one particular table. So, I have a bunch of shell scripts (about 600) where I need to comment out the first 7 lines, then rerun the file. There are about 6000 shell scripts in this folder, and nothing about a particular file can tell me whether it needs the edit. I've got a list of the affected files, which I pulled from the database output.
So how do I write a bash script (or anything else that would work better) to take this list of file names and for each of them, comment out the first 7 lines, and run the script?
EDIT:
#!/usr/bin/env sh
cmd1
cmd2
cmd3
cmd4
cmd5
cmd6
cmd7
cmd8
Not sure how readable that is. Basically, the first 7 lines (not counting the first line) need to have a # added to the beginning of them. Note: the files have been edited to make each line shorter, and they were partially cut off when copying out of Vim. But in the main part of each file, there is a line starting with echo, then a line starting with sqlldr.
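In other words, after the edit each affected script should look like this (same placeholder commands as above):
#!/usr/bin/env sh
#cmd1
#cmd2
#cmd3
#cmd4
#cmd5
#cmd6
#cmd7
cmd8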
Using sed, you can specify a line number range in the file to be changed.
#!/bin/bash
while read -r line
do
    # add a comment to the beginning of lines 3 - 9, saving to a new script
    sed '3,9 s/^/#/' "$line" > "$line.new"
    sh "$line.new"
done < "filelist.txt"
You may wish to test this before running it on all of those scripts...
EDIT: changed the line numbers to reflect the comments.
Roughly speaking:
#!/bin/sh
for file in "$@"
do
    # comment out lines 2-8 in a temporary copy (named after the file, in /tmp), run it, then clean up
    out=/tmp/${file##*/}.$$
    sed '2,8s/^/#/' < "$file" > "$out"
    $SHELL "$out"
    rm -f "$out"
done
Assuming you don't care about checking for race conditions etc.
ex seems made for what you want to do.
For instance, for editing one file, with a here document:
#!/bin/sh
ex test.txt << END
1,12s/^/#/
wq
END
That'll comment out the first 12 lines in "test.txt". For your example you could try "$FILE" or similar (including quotes!).
Then run them the usual way, i.e. ./"$FILE"
edit: $SHELL "$FILE" is probably a better approach to run them (from one of the above commenters).
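Putting that together with a list of files (here read from filelist.txt, and using the 2,8 range for "the first 7 lines after the shebang"), a rough sketch:
#!/bin/sh
# sketch: comment out lines 2-8 of each listed script in place with ex, then run it
while read -r FILE
do
    ex "$FILE" << 'END'
2,8s/^/#/
wq
END
    $SHELL "$FILE"
done < filelist.txt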
Ultimately you're going to want to use the Linux command sed. You know whatever logic you need to place in the script, but it will ultimately call sed. http://lowfatlinux.com/linux-sed.html

How to test things in crontab

This keeps happening to me all the time:
1) I write a script (ruby, shell, etc).
2) run it, it works.
3) put it in crontab so it runs in a few minutes so I know it runs from there.
4) It doesn't, no error trace, back to step 2 or 3 a thousand times.
When my ruby script fails in crontab, I can't really know why it fails, because when I pipe output like this:
ruby script.rb >& /path/to/output
I sorta get the output of the script, but I don't get any of the errors from it and I don't get the errors coming from bash (like if ruby is not found or file isn't there)
I have no idea what environmental variables are set and whether or not it's a problem. Turns out that to run a ruby script from crontab you have to export a ton of environment variables.
Is there a way for me to just have crontab run a script as if I ran it myself from my terminal?
When debugging, I have to reset the timer and go back to waiting. Very time consuming.
How to test things in crontab better or avoid these problems?
"Is there a way for me to just have crontab run a script as if I ran it myself from my terminal?"
Yes:
bash -li -c /path/to/script
From the man page:
[vindaloo:pgl]:~/p/test $ man bash | grep -A2 -m1 -- -i
-i If the -i option is present, the shell is interactive.
-l Make bash act as if it had been invoked as a login shell (see
INVOCATION below).
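So the crontab entry itself might look something like this (the five-minute interval and the paths are just placeholders):
*/5 * * * * bash -li -c 'ruby /path/to/script.rb' > /path/to/output 2>&1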
G'day,
One of the basic problems with cron is that you get a minimal environment being set by cron. In fact, you only get four env. vars set, and they are:
SHELL - set to /bin/sh
LOGNAME - set to your userid as found in /etc/passwd
HOME - set to your home dir. as found in /etc/passwd
PATH - set to "/usr/bin:/bin"
That's it.
However, what you can do is take a snapshot of the environment you want and save that to a file.
Now make your cronjob source a trivial shell script that sources this env. file and then executes your Ruby script.
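A minimal sketch of that approach (all file names here are just placeholders): from an interactive bash session, snapshot your exported variables once with
export -p > ~/cron_env.sh
and then have cron call a small wrapper such as ~/bin/run_with_env.sh:
#!/bin/bash
# pull in the saved environment, then hand off to the Ruby script
source ~/cron_env.sh
exec ruby /path/to/script.rb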
BTW Having a wrapper source a common env. file is an excellent way to enforce a consistent environment for multiple cronjobs. This also enforces the DRY principle because it gives you just one point to update things as required, instead of having to search through a bunch of scripts and search for a specific string if, say, a logging location is changed or a different utility is now being used, e.g. gnutar instead of vanilla tar.
Actually, this technique is used very successfully with The Build Monkey which is used to implement Continuous Integration for a major software project that is common to several major world airlines. 3,500kSLOC being checked out and built several times a day and over 8,000 regression tests run once a day.
HTH
'Avahappy,
Run a 'set' command from inside of the ruby script, fire it from crontab, and you'll see exactly what's set and what's not.
To find out the environment in which cron runs jobs, add this cron job:
{ echo "\nenv\n" && env | sort ; echo "\nset\n" && set; } | /usr/bin/mailx -s 'my env' you@example.com
Or send the output to a file instead of email.
You could write a wrapper script, called for example rbcron, which looks something like:
#!/bin/bash
RUBY=ruby
export VAR1=foo
export VAR2=bar
export VAR3=baz
$RUBY "$@" 2>&1  # pass all arguments through to ruby, merging stderr into stdout
This will redirect standard error from ruby to the standard output. Then you run rbcron in your cron job, and the standard output contains out+err of ruby, but also any "bash" errors coming from rbcron itself. In your cron entry, redirect with > /path/to/output 2>&1 to get both output and error messages into /path/to/output.
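A matching crontab entry might then be (the interval and paths are again just placeholders):
*/15 * * * * /path/to/rbcron /path/to/script.rb > /path/to/output 2>&1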
If you really want to run it as yourself, you may want to invoke ruby from a shell script that sources your .profile/.bashrc etc. That way it'll pull in your environment.
However, the downside is that it's not isolated from your environment, and if you change that, you may find your cron jobs suddenly stop working.
