Is it possible to run a bash command in IRC?

Over SSH I can execute my bash script using
./myscript.sh -g House -y 2019 -u https://someurl.com Artist - Album
The script reads from a directory which contains subfolders of various artists, but when I execute the trigger from IRC it tells me there's an invalid folder name.
The IRC trigger is !myscript -g House -y 2019 -u https://someurl.com Artist - Album.
As it stands I use this code to trigger the IRC command
proc dupe:myscript {nick host hand chan arg} {
    set _bin "/home/eggdrop/logfw/myscript.sh"
    if {[catch {exec $_bin "$arg" &} error]} {
        putnow "PRIVMSG $chan :Error.. $error"
    } else {
        putnow "PRIVMSG $chan :Running.. $arg"
    }
}
The error I'm getting is that it can't find the folder name, because it reports the folder as -g House -y 2019 -u https://someurl.com Artist - Album,
so I need IRC or bash to strip the option arguments and leave just the folder name.
I think the error comes from Tcl sending one quoted string, but I'm not sure how to fix the issue.

The problem is that you're sending $arg as one string rather than multiple arguments. The fix is probably to do this:
if {[catch {exec $_bin {*}$arg &} error]} {
(The rest of your code will be the same.)
You probably need to take a few extra steps to defend against assholes doing redirections and other shenanigans. That's easy enough:
proc dupe:myscript {nick host hand chan arg} {
    set _bin "/home/eggdrop/logfw/myscript.sh"
    # You might need this too; it ensures that we have a proper Tcl list going forward:
    set arglist [split $arg]
    # Check (aggressively!) for anything that might make exec do something weird
    if {[lsearch -glob $arglist {*[<|>]*}] >= 0} {
        # Found a potential naughty character! Tell the user to get lost…
        putnow "PRIVMSG $chan :Error.. bad character in '$arg'"
        return
    }
    if {[catch {exec $_bin {*}$arglist &} error]} {
        putnow "PRIVMSG $chan :Error.. $error"
    } else {
        putnow "PRIVMSG $chan :Running.. $arg"
    }
}
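To see why the {*} expansion matters, here is a minimal tclsh sketch (not eggdrop-specific; the /usr/bin/printf path and the sample trigger text are just stand-ins) showing what an external program receives in each case:

# Sample trigger text, as the proc would receive it in $arg:
set arg "-g House -y 2019 -u https://someurl.com Artist - Album"
# Passed as one quoted word, the child process sees a single argument:
puts [exec /usr/bin/printf {[%s] } $arg]
# => [-g House -y 2019 -u https://someurl.com Artist - Album]
# Split into a Tcl list and expanded with {*}, the child sees separate arguments,
# so the script's own option parsing can consume -g/-y/-u and be left with the folder name:
puts [exec /usr/bin/printf {[%s] } {*}[split $arg]]
# => [-g] [House] [-y] [2019] [-u] [https://someurl.com] [Artist] [-] [Album]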

Related

Jenkins pipeline undefined variable

I'm trying to build a Jenkins Pipeline for which a parameter is
optional:
parameters {
    string(
        name:'foo',
        defaultValue:'',
        description:'foo is foo'
    )
}
My purpose is to call a shell script, providing foo as an argument:
stages {
    stage('something') {
        sh "some-script.sh '${params.foo}'"
    }
}
The shell script will do the Right Thing™ if the provided value is the empty
string.
Unfortunately I can't just get an empty string. If the user does not provide
a value for foo, Jenkins sets it to null, and I get null
(as a string) inside my command.
I found this related question but the only answer is not really helpful.
Any suggestion?
OP here: I realized a wrapper script can be helpful. I ironically called it junkins-cmd and I call it like this:
stages {
    stage('something') {
        sh "junkins-cmd some-script.sh '${params.foo}'"
    }
}
Code:
#!/bin/bash
helpme() {
    cat <<EOF
Usage: $0 <command> [parameters to command]
This command is a wrapper for jenkins pipeline. It tries to overcome jenkins
idiotic behaviour when calling programs without polluting the remaining part
of the toolkit.
The given command is executed with the fixed version of the given
parameters. Current fixes:
- 'null' is replaced with ''
EOF
} >&2
trap helpme EXIT
command="${1:?Missing command}"; shift
trap - EXIT
typeset -a params
for p in "$@"; do
    # Jenkins pipeline uses 'null' when the parameter is undefined.
    [[ "$p" = 'null' ]] && p=''
    params+=("$p")
done
exec $command "${params[@]}"
Beware: params+=("$p") is not portable among shells, hence this ugly script runs under #!/bin/bash.
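A quick way to sanity-check the wrapper from a shell is to let printf stand in for some-script.sh (the ./ path assumes the wrapper sits in the current directory and is executable; this is only an illustration, not part of the pipeline):

# 'null' should reach the command as an empty string,
# while every other argument passes through untouched:
./junkins-cmd printf '[%s]\n' null foo
# prints:
# []
# [foo]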

Expect fails but I don't see why

I have a bash script that gets info from Heroku so that I can pull a copy of my database. That script works fine in Cygwin, but when run from cron it halts, because the shell it uses stops at Heroku's authentication prompt from the Heroku Toolbelt.
Here is my crontab:
SHELL=/usr/bin/bash
5 8-18 * * 1-5 /cygdrive/c/Users/sam/work/push_db.sh >>/cygdrive/c/Users/sam/work/output.txt
I have read the Googles and the man page within cygwin to come up with this addition:
#!/usr/bin/bash
. /home/sam.walton/.profile
echo $SHELL
curl -H "Accept: application/vnd.heroku+json; version=3" -n https://api.heroku.com/
#. $HOME/.bash_profile
echo `heroku.bat pgbackups:capture --expire`
#spawn heroku.bat pgbackups:capture --expire
expect {
    "Email:" { send -- "$($HEROKU_LOGIN)\r"}
    "Password (typing will be hidden):" { send -- "$HEROKU_PW\r" }
    timeout { echo "timed out during login"; exit 1 }
}
sleep 2
echo "first"
curl -o latest.dump -L "$(heroku.bat pgbackups:url | dos2unix)"
Here's the output from the output.txt
/usr/bin/bash
{
"links":[
{
"rel":"schema",
"href":"https://api.heroku.com/schema"
}
]
}
Enter your Heroku credentials. Email: Password (typing will be hidden): Authentication failed. Enter your Heroku credentials. Email: Password (typing will be hidden): Authentication failed. Enter your Heroku credentials. Email: Password (typing will be hidden): Authentication failed.
As you can see, it appears that the output never gets the result of the send command, as if expect is still waiting. I've run many experiments with the credentials and the expect statements; all stop here. I've seen a few examples and tried them out, but I'm getting fuzzy-eyed, which is why I'm posting here. What am I not understanding?
Thanks to comments, I'm reminded to explicitly place my env variables in .bashrc:
[[ -s $USERPROFILE/.pik/.pikrc ]] && source "$USERPROFILE/.pik/.pikrc"
export HEROKU_LOGIN=myEmailHere
export HEROKU_PW=myPWhere
My revised script per @Dinesh's excellent example is below:
. /home/sam.walton/.bashrc
echo $SHELL
echo $HEROKU_LOGIN
curl -H "Accept: application/vnd.heroku+json; version=3" -n https://api.heroku.com/
expect -d -c "
spawn heroku.bat pgbackups:capture --expire --app gw-inspector
expect {
    "Email:" { send -- "myEmailHere\r"; exp_continue}
    "Password (typing will be hidden):" { send -- "myPWhere\r" }
    timeout { puts "timed out during login"; exit 1 }
}
"
sleep 2
echo "first"
This should work, but the echo of the variable prints nothing, which gives me a clue that the variable is not being picked up, so I am testing with the values hardcoded to rule that out. But as you can see from my output, not only does the echo yield nothing, there is also no sign of any diagnostics being produced, which makes me wonder whether the script is even reaching expect, or what the spawn command is doing. To restate, the heroku.bat command works outside the expect block; the result of running the script above is:
/usr/bin/bash
{
"links":[
{
"rel":"schema",
"href":"https://api.heroku.com/schema"
}
]
}
What am I doing wrong, and why am I not seeing any diagnostic output?
If you are going to embed the expect code inside your bash script, instead of calling it as a separate script, then you should use the -c flag.
From your code, I assume that you have the environment variables HEROKU_LOGIN and HEROKU_PW declared in the bashrc file.
#!/usr/bin/bash
# Your code here

# HEROKU_LOGIN & HEROKU_PW will be replaced with the bash variables' values,
# because the expect script is wrapped in double quotes.
expect -c "
    spawn <your-executable-process-here>
    expect {
        \"Email:\" { send -- \"$HEROKU_LOGIN\r\"; exp_continue }
        \"Password (typing will be hidden):\" { send -- \"$HEROKU_PW\r\" }
        timeout { puts \"timed out during login\"; exit 1 }
    }
"
# Your further bash code here
You should not use the echo command inside expect code; use puts instead. Spawning the process inside the expect code is more robust than spawning it outside.
Notice the use of double quotes with the expect -c flag. If you use single quotes, then bash won't do any substitution. So, if you need bash variable substitution, use double quotes around the script you pass to expect with the -c flag.
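As a tiny illustration of that quoting difference (GREETING is just a throwaway bash variable for this sketch):

GREETING="hello"
# Double quotes: bash substitutes $GREETING before expect runs, so this prints hello.
expect -c "puts \"$GREETING\""
# Single quotes: bash passes the text through untouched, so expect looks for a
# Tcl variable named GREETING and fails with: can't read "GREETING": no such variable
expect -c 'puts "$GREETING"'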
To know more about the usage of the -c flag, have a look here.
If you still have any issues, you can debug by appending -d in the following way:
expect -d -c "
    your code here
"

How to find if a file exists in expect script

I have a statement inside my expect script like this
send "sed -i -e 's/$oldport/$newport/' backup.txt\r"
expect "$ "
However, I wish to first check whether the file backup.txt exists and, if it does, then edit it.
How do I achieve this?
Thanks
Since expect is an extension of Tcl, all Tcl commands are available to use:
if {[file exists backup.txt]} {
    send "sed -i -e 's/$oldport/$newport/' backup.txt\r"
    expect "$ "
}
Here is a quick approach using ls:
set file "backup.txt"
send "ls $file\r"
expect {
    "\r\n$file" {puts "\nfile exists"}
    "cannot access $file" {puts "\nfile not found"}
}
You can just replace puts "\nfile exists" with your statement.
A slightly modified version of @asdone's solution.
It works on the remote host, unlike @glenn jackman's, which tests for the file on the local system where the expect script is running.
#!/usr/bin/expect --

# settings
set prompt "\$ "
set filename "example-file.txt"
set response__file_not_found "No such file or directory*$prompt"
set response__file_exists "$filename*$prompt"

send "ls $filename\r"
expect {
    $response__file_not_found {
        puts "\n\nFile not found\n\n"
        # must exit as the negative response matches both cases
        exit
    }
    $response__file_exists {
        puts "\n\nFile exists\n\n"
    }
}

Code to count number of types of files on client in Unix

Hi, I am new to shell scripting.
My requirement is:
There is one server and 3 clients. On each client, error log files of 4 types are generated (say type 1 error, type 2 error, up to type 4 error).
I want to write a script which, from the server, reads all 3 clients and gives me the number of times each of the 4 different types of error log is generated on each client.
In short, it should use a combination of the ssh and grep commands.
I have written a demo script, but it's not giving me the number of times the different types of logs occurred on the clients.
#error[1]='Exception: An application error occurred during an address lookup request, please contact IT'
#error[2]='SocketTimeoutException: Read timed out'
#error[3]='Exception: The search has produced too many matches to be returned'
#error[4]='Exception: No matching address found'
error_1='exception 1'
error_2='exception 2'

function get_list_of_clients()
{
    NUM_OF_CLIENTS=$(wc -l ${CLIENT_IP_LIST} | awk -F " " '{ print $1 }')
    echo $NUM_OF_CLIENTS
    if [ "${NUM_OF_CLIENTS}" -gt 0 ]
    then
        for ((row=2; row<=$NUM_OF_CLIENTS; row++))
        do
            CLIENTS_IP=$(sed -n ${row}p ${CLIENT_IP_LIST} | awk -F " " '{print $3 }')
            echo ${CLIENTS_IP}
            # get_number_of_errors
            # copy_count_errors
            echo ${$error_$row}
        done
    fi
}
function get_number_of_errors()
{
    for ((row_no=1; row_no<=4; row_no++))
    do
    {
        /usr/bin/expect - <<- EndMark
spawn ssh root@${CLIENTS_IP} "grep $error[$row_no] var/error.log | wc -l" >> /tmp/${CLIENTS_IP}_error${row_no}.txt
match_max 50000
expect {
    "*yes/no*" {
        send -- "yes\r"
        send -- "\r"
        exp_continue
    }
    "*?assword:*" {
        send -- "${CLIENT_PASSWORD}\r"
        send -- "\r"
    }
}
expect eof
EndMark
    }
    done
}
function copy_count_errors()
{
    /usr/bin/expect - <<- EndMark
spawn scp root@${CLIENTS_IP}:/tmp/${CLIENTS_IP}* /tmp/
match_max 50000
expect {
    "*yes/no*" {
        send -- "yes\r"
        send -- "\r"
        exp_continue
    }
    "*?assword:*" {
        send -- "${CLIENT_PASSWORD}\r"
        send -- "\r"
    }
}
expect eof
EndMark
}
get_list_of_clients
Please help.
This is not really an answer, just an attempt to help you get to your own.
The Problem
If I understand it correctly:
Your script runs on the server
You have three clients, each has log files
Your list of clients is in a file named $CLIENT_IP_LIST, where the IP is the third field and the first line is some sort of header, which you want to exclude.
Suggestions
It seems you need to ssh and scp to the clients. If possible, I suggest setting up SSH Public Key Authentication between your server and clients. This will greatly simplify your scripts (see the sketch at the end of this answer).
Use the -C flag for compression with the scp and ssh commands.
You should copy the log files from the clients to the server and do processing on the server.
If you can, choose a different scripting language such as Python or Tcl (you already used Expect, which is really Tcl). Bash can handle your problem, but you will find other languages work better. This is my opinion, of course.
Get one piece of your puzzle to work before moving on to the next. For example, right now your get_list_of_clients() function is not working yet.
That being said, here is a rewrite of get_list_of_clients, which I tested and it works within my assumptions about your $CLIENT_IP_LIST file:
function get_list_of_clients() {
    let row=1
    while read line
    do
        if (( row > 1 ))
        then
            set $line # Breaks line into pieces
            CLIENT_IP="$3"
            echo "$CLIENT_IP"
        fi
        let row=row+1
    done < $CLIENT_IP_LIST
}
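Building on the suggestions above, here is a rough sketch of what the counting step could look like once key-based SSH login is in place. The error strings and the /var/error.log path are lifted from the commented-out lines in the question and are assumptions, as is logging in as root:

# Rough sketch only: assumes passwordless (public key) SSH to each client
# and that these patterns actually appear in /var/error.log.
errors=(
    'An application error occurred during an address lookup request'
    'SocketTimeoutException: Read timed out'
    'The search has produced too many matches to be returned'
    'No matching address found'
)

function get_number_of_errors() {
    local client_ip="$1"
    for ((i = 0; i < ${#errors[@]}; i++)); do
        # grep -c prints the number of matching lines; -C turns on ssh compression.
        count=$(ssh -C "root@${client_ip}" "grep -c '${errors[$i]}' /var/error.log")
        echo "${client_ip} error type $((i + 1)): ${count}"
    done
}

# Count all four error types on every client reported by get_list_of_clients:
for ip in $(get_list_of_clients); do
    get_number_of_errors "$ip"
done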

Pastebinit script will not word wrap

I found this "googlizer" script that launches my browser and searches Google using whatever is highlighted on my system as the search string. I've modified it using the pastebinit package here on Ubuntu so that it takes what is highlighted, pipes it to pastebinit (using paste.ubuntu.com), and finally pipes the pastebin URL back to the clipboard.
It works well except for the fact that the pasted text comes back as a single line.
Here is my script:
#!/bin/sh
# the next line restarts using wish \
exec wish "$0" "$@"
wm withdraw .
if { [catch { set word [selection get "STRING"] }] != 0 } { exit 0 }
regsub -all "( |\n)+" "$word" " " googleWord
exec echo \"$googleWord\" | pastebinit -b http://paste.ubuntu.com | xclip -selection clipboard
exit 0
How can I preserve the original pre-script formatting?
