Obfuscating a command within a shell script - bash

There are a lot of tips (and warnings) on here for obfuscating various items within scripts.
I'm not trying to hide a password; I'm just wondering if I can obfuscate an actual command within the script to defeat the casual user/grepper.
Background: We have a piece of software that helps manage machines within the environment. These machines are owned by the enterprise. The users sometimes get it in their heads that this computer is theirs and they don't want "The Man" looking over their shoulders.
I've developed a little something that will check to see if a certain process is running, and if not, clone it up and replace.
Again, the purpose of this is not to defeat anyone other than the casual user.
It was suggested that one could echo an octal value (the 'obfuscated' command) and use it as a variable within the script. e.g.:
strongBad=`echo "\0150\0157\0163\0164\0156\0141\0155\0145"`
I could then use $strongBad within the shell script to slyly call the commands that I wanted to call with arguments?
/bin/$strongBad -doThatThingYouDo -DoEEET
Is there any truth to this? So far it has worked when typed directly into the shell on the command line (using the -e flag with echo), but not so much within the script; I'm getting unexpected output, perhaps because of the way I'm using it?
As a test, try this in the command line:
strongBad=`echo -e "\0167\0150\0157"`
And then
$strongBad
You should get the same output as "who".
EDIT
Upon further review, the addition of the path to the echo command in the variable is breaking it. Perhaps that's the source of my issue.
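As a side note, here is a minimal sketch of the octal idea that works inside a script: build the string with printf, which interprets octal escapes itself, so no -e flag and no path to an external echo is needed. The octal just spells out who, as in the test above; the -q argument is only there to show that arguments can be appended.
#!/bin/bash
strongBad=$(printf '\167\150\157')   # \167\150\157 is octal for "who"
"$strongBad"                         # runs who, same as the command-line test
"$strongBad" -q                      # arguments can be appended as usual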

You can do a rotate 13 (ROT13) on any command you want hidden beforehand, then just have the obfuscated command in the shell script.
This little bash script:
#!/bin/bash
# rot13 everything passed to the function
function rot13 {
  echo "$*" | tr '[a-m][n-z][A-M][N-Z]' '[n-z][a-m][N-Z][A-M]'
}
rot13 echo hello, world!
`rot13 rpub uryyb, jbeyq!`
Produces:
rpub uryyb, jbeyq!
hello, world!
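In the management script itself, the call could then look something like the line below; the rot13 function from above has to be defined first, jub is just who run through rot13, and the flags are the placeholder arguments from the question:
`rot13 jub` -doThatThingYouDo -DoEEET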

Related

How do I execute a terminal command that runs a bash script and makes use of a string argument?

I'm a bit confused (no experience with this stuff)
So, I'm supposed to run a terminal command that looks something like this:
./script_name file.txt "1, 2, 5"
So, the command would run a script that would take a file and print out the lines indicated in the string (this is just an example).
I'm just a bit confused about how to do that in line with running the script, the way the command indicates. I understand that I could run the script first and then prompt for user input about which lines, which file, and so on. But how is a command like the one above set up and used in the script?
I can include file.txt in the same directory, and I assume it can be accessed that way, so that wouldn't be a problem. It's the string that is confusing me. How would that be passed in and stored, rather than running the script first and then asking for user input?
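For reference, here is a minimal sketch of how such a script might read those two arguments; inside the script, $1 and $2 are already set when it starts, so nothing has to be prompted for. The line-printing loop is just one illustrative way to use the string:
#!/bin/bash
file=$1       # first argument: file.txt
lines=$2      # second argument: the whole quoted string, e.g. "1, 2, 5"

# split the string on commas/spaces and print each requested line
IFS=', ' read -ra nums <<< "$lines"
for n in "${nums[@]}"; do
  sed -n "${n}p" "$file"
done
You would then run it exactly as shown: ./script_name file.txt "1, 2, 5".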

Is it possible to make a .bat Bash hybrid?

In cmd, it is possible to use Linux commands with the ubuntu or bash commands, but they are very fickle. In batch, it is also possible to make a VBScript-batch hybrid, which got me thinking, is it possible to make a Bash-batch hybrid? Besides being a tongue-twister, I feel that Bash-batch scripts may be really useful.
What I have tried so far
So far I have tried using the bare bash and ubuntu commands on their own, since they switch the normal command prompt over to the Ubuntu/Bash shell, but even if you put commands after ubuntu/bash, they wouldn't show or do anything.
After that, I tried the ubuntu -run command, but like I said earlier, it's really fickle and inconsistent about what works and what doesn't. It is less inconsistent when you pipe things into it, but it still usually doesn't work.
I looked here since it seemed like it would answer my question and I tried it, but it didn't work since it required another program (I think).
I also looked at this, and I guess it failed miserably, but it's an interesting concept.
What I've gathered from all of my research is that when this is mentioned, most people think of a file that can be run either as a .bat file or as a .sh shell file, rather than my goal: a file that runs both batch and Bash commands in the same instance.
What I want this for relates to my other question, where I am trying to hash a string instead of a file in cmd; you can do it with a Bash command, but I would still like to keep the file as a batch file.
Sure you can use Bash in batch, assuming it is available. Just use the command bash -c 'cmd', where cmd is the command that you want to run in Bash.
The following batch line pipes Hello to the cat -A command, which prints it including the invisible characters:
echo Hello | bash -c "cat -A"
Compare the output with the result of the version completely written in Bash:
bash -c "echo Hello | cat -A"
They will differ slightly: the cmd version's output carries Windows-style CRLF line endings (shown by cat -A as ^M before the $), and typically the space before the pipe as well, while the pure Bash version ends with just $.

missed $ while using variableName in bash script - how to catch such issues?

I have a bash script which does the following:
#!/bin/bash
moduleName=$1
someInfo=`ls | grep -w moduleName`
echo $someInfo
In line #3, I was supposed to use $moduleName, but I missed the $.
Is there any way to find such issues in Bash scripts?
I used ShellCheck, but it didn't report this issue.
For me, the script looks fine; it lists the files whose names contain the string moduleName.
The script is syntactically correct.
The error in line #3 is a semantic error; it changes the meaning of the script. Only a person that knows the intention of the script can detect it.
There is no way to automatically detect such errors, unless you write software that reads your mind and knows that you intended to write $moduleName but mistakenly wrote moduleName instead.
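For completeness, the corrected line for this particular script (with the variable expanded, and quoted so that values containing spaces are handled) would be:
someInfo=`ls | grep -w "$moduleName"`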

Bash: How to create a test mode that displays commands instead of executing them

I have a bash script that executes a series of commands, some involving redirection. See cyrus-mark-ham-spam.
I want the script to have a test mode, where all the commands run are printed instead of executing them. As you can see, I have tried to do that by just putting "echo" on the front of each command in test mode.
Unfortunately this doesn't deal with redirection - any redirections are still done, so the program leaves lots of temp files littered about the place when run in test mode.
I have tried various ways to get round this, like quoting the whole command and passing it to a function that either prints it or runs it, but either the redirections work in test mode, or they don't work in run mode.
I thought this must have come up before, and wonder if there is a known solution which does not involve wrapping every command in its own if TEST branch?
Please note, this is NOT a duplicate of show commands without executing them because neither that question, nor its answers, covers redirection (which is the essence of this question).
I see that it is not a duplicate, but there is no general solution to this; you need to look at each command separately.
As long as the command doesn't use arguments that themselves contain spaces or quotes, like
cmd -a -b -c > filename
you can quote the whole thing and echo it:
echo 'cmd -a -b -c > filename'
But real-life code is more complex, of course.
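One common workaround along those lines (a sketch, not a general solution, and it relies on the whole command being passed as a single quoted string) is a small run wrapper that either prints or evals the command, so redirections stay inert in test mode:
#!/bin/bash
TEST=${TEST:-}             # set TEST=1 in the environment to enable test mode

run() {
  if [ -n "$TEST" ]; then
    printf '%s\n' "$*"     # just print the command text, redirection and all
  else
    eval "$*"              # execute it, so the redirection actually happens
  fi
}

run "cmd -a -b -c > filename"
run "sort < input.txt > output.txt"
The trade-off is the one already mentioned: commands whose arguments contain quotes or embedded spaces get awkward to quote, and eval re-parses the string, so treat this as a sketch rather than a drop-in solution.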

How to test things in crontab

This keeps happening to me all the time:
1) I write a script (Ruby, shell, etc.).
2) run it, it works.
3) put it in crontab, scheduled a few minutes out, so I know it runs from there.
4) It doesn't: no error trace, and I'm back to step 2 or 3 a thousand times.
When my Ruby script fails in crontab, I can't really tell why it fails, because when I pipe the output like this:
ruby script.rb >& /path/to/output
I sort of get the output of the script, but I don't get any of the errors from it, and I don't get the errors coming from bash (like if ruby is not found or the file isn't there).
I have no idea what environment variables are set and whether or not that's a problem. It turns out that to run a Ruby script from crontab you have to export a ton of environment variables.
Is there a way for me to just have crontab run a script as if I ran it myself from my terminal?
When debugging, I have to reset the timer and go back to waiting. Very time consuming.
How to test things in crontab better or avoid these problems?
"Is there a way for me to just have crontab run a script as if I ran it myself from my terminal?"
Yes:
bash -li -c /path/to/script
From the man page:
[vindaloo:pgl]:~/p/test $ man bash | grep -A2 -m1 -- -i
-i If the -i option is present, the shell is interactive.
-l Make bash act as if it had been invoked as a login shell (see
INVOCATION below).
G'day,
One of the basic problems with cron is that you get a minimal environment set by cron. In fact, you only get four environment variables set, and they are:
SHELL - set to /bin/sh
LOGNAME - set to your userid as found in /etc/passwd
HOME - set to your home dir. as found in /etc/passwd
PATH - set to "/usr/bin:/bin"
That's it.
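To see how a script behaves under that minimal environment without waiting for cron, you can approximate it with env -i; the values mirror the defaults above, and the script path is just a placeholder:
env -i SHELL=/bin/sh PATH=/usr/bin:/bin HOME="$HOME" LOGNAME="$LOGNAME" /bin/sh -c '/path/to/script'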
However, what you can do is take a snapshot of the environment you want and save that to a file.
Then have your cron job run a trivial wrapper shell script that sources this environment file and then executes your Ruby script.
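As a rough sketch of that (file and script names are just examples), capture the environment once from a normal interactive shell:
export -p > "$HOME/.cron_env"
and have the cron entry run a small wrapper instead of the Ruby script directly:
#!/bin/bash
. "$HOME/.cron_env"                # restore the saved environment
exec ruby /path/to/script.rb "$@"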
BTW, having a wrapper source a common environment file is an excellent way to enforce a consistent environment for multiple cron jobs. It also enforces the DRY principle, because it gives you just one place to update things as required, instead of having to search through a bunch of scripts for a specific string if, say, a logging location changes or a different utility is now being used, e.g. gnutar instead of vanilla tar.
Actually, this technique is used very successfully with The Build Monkey which is used to implement Continuous Integration for a major software project that is common to several major world airlines. 3,500kSLOC being checked out and built several times a day and over 8,000 regression tests run once a day.
HTH
'Avahappy,
Run a 'set' command from inside of the ruby script, fire it from crontab, and you'll see exactly what's set and what's not.
To find out the environment in which cron runs jobs, add this cron job:
{ echo -e "\nenv\n" && env | sort ; echo -e "\nset\n" && set; } | /usr/bin/mailx -s 'my env' you@example.com
Or send the output to a file instead of email.
You could write a wrapper script, called for example rbcron, which looks something like:
#!/bin/bash
RUBY=ruby
export VAR1=foo
export VAR2=bar
export VAR3=baz
$RUBY "$*" 2>&1
This will redirect standard error from Ruby to standard output. You then run rbcron in your cron job, and the standard output contains Ruby's output and errors, as well as any "bash" errors coming from rbcron itself. In your cron entry, redirect with > /path/to/output 2>&1 so that both output and error messages go to /path/to/output.
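A matching crontab entry might then look like this (schedule and paths are placeholders):
*/10 * * * * /path/to/rbcron /path/to/script.rb > /path/to/output 2>&1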
If you really want to run it as yourself, you may want to invoke ruby from a shell script that sources your .profile/.bashrc etc. That way it'll pull in your environment.
However, the downside is that it's not isolated from your environment, and if you change that, you may find your cron jobs suddenly stop working.
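A minimal version of that wrapper might be (assuming the .profile is sh/bash-compatible; the script path is again a placeholder):
#!/bin/bash
. "$HOME/.profile"                 # pull in your normal login environment
exec ruby /path/to/script.rb "$@"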
