Passing salt minion id to Jenkins pipeline from a flat file - jenkins-pipeline

I'm looking for some ideas here. I have a set of Jenkins pipeline jobs. The target machine/minion name is passed as a parameter into the job, and the jobs run fine. I have been asked to drive this from a file, i.e. all the minion IDs or target machines are listed in a flat file, and I want Jenkins to pick up the machine names in a loop and execute the pipeline for each. The pipeline runs Salt state files in the background. Any idea how to achieve this?

If you have a file minions.txt containing a list of minions, e.g.
minion_01
minion_02
minion_03
salt can target the minions by list when you use the -L/--list option.
You can call e.g. test.ping on these minions with the following command:
salt --list `awk -vORS=, '{ print $1 }' minions.txt | sed 's/,$/\n/'` test.ping
ORS is the Output Record Separator in awk: you ask awk to print the file line by line but join the lines using , as the separator. sed then replaces the trailing comma with a newline.
Finally, you can wrap the command in Jenkins with sh """ ... """.
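If you want Jenkins to iterate over the minions one at a time instead of targeting them as a list, a minimal sketch of the shell part (which could live inside the pipeline's sh """ ... """ step) might look like this; the state name my.state and the location of minions.txt are assumptions, not from the original question:
#!/bin/bash
# Read minions.txt line by line and apply a Salt state to each minion.
# "my.state" is a placeholder for whatever state the pipeline runs.
while IFS= read -r minion; do
    [ -z "$minion" ] && continue   # skip blank lines
    echo "Applying state to $minion"
    salt "$minion" state.apply my.state
done < minions.txt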

Related

Bash - SSH to remote server and retrieve data

I have a bash script that needs to connect to another server for parts of its execution. I have tried many of the standard instructions and syntaxes for executing ssh commands, but with little progress.
On the remote server, I need to source a shell script that contains several environment parameters for some software. One of these parameters is then used in a file path that points to an executable, which accepts a ' -lprojects ' option that lists the projects for the software on that server.
I have verified multiple times that running the commands on the server itself works. My issue is when I try to run the same commands over SSH. If I use the environment variable in the file path, it turns out to be null, giving a file/directory not found error. If I hard-code the file path to point to the executable, I get an error saying that the shell script is not sourced (which I assume it needs, for other functions and APIs, before the executable will reveal its -lprojects option).
Here is roughly what the code looks like:
ssh remote.server 'source /filepath/remotescript.sh'
filelist=$(ssh remote.server $REMOTEVARIABLE'/bin/executable -lprojects')
echo ${filelist[@]}
for file in $filelist
do
echo $file
ssh SERVER2 awk 'something' /filepath/"$file"/somefile.txt | sed 'something' >> filepath/values.csv;
done
As you can see, I then also need to loop through the contents of the -lprojects output from remote.server, run some awk and sed on the files to extract the wanted text (this works), and then write that back to the values.csv file on the client (local server). This is more general, as there will be several servers I have to do this for, but all of them have to write to the same .csv file. For simplicity, you can regard this as the one-remote-server case, since it is vital I get it working for at least one to begin with.
Note that I also tried something like:
ssh remote.server << EOF
'source /filepath/remotescript.sh'
filelist=$(ssh remote.server $REMOTEVARIABLE'/bin/executable -lprojects')
EOF
But with similar results. I also tried placing the single quotes in the filelist assignment both before and after the remote variable, etc.
How do I go about properly doing this?
To access the environment variable, you must source the script that defines it within the same SSH call as the one where you use it; otherwise you are running your commands in two unrelated shells:
filelist=$(ssh remote.server 'source /filepath/remotescript.sh; $REMOTEVARIABLE/bin/executable -lprojects')
Assuming the executable outputs one file name per line, you can use readarray to achieve this:
readarray -t filelist < <(ssh remote.server '
source /filepath/remotescript.sh
$REMOTEVARIABLE/bin/executable -lprojects
')
echo "${filelist[@]}"
for file in "${filelist[@]}"
do
echo $file
ssh SERVER2 awk 'something' /filepath/"$file"/somefile.txt | sed 'something' >> filepath/values.csv;
done

How to write a script in Linux to log in to several servers one by one and fetch their description?

How to write a script in Linux to log in to several servers one by one and fetch their description?
I am learning shell scripting and have tried a few commands, but I am not able to arrange the logic into code.
As my question allows a wide range of possibilities, to be exact: I want to create a script that opens a file in /tmp named 'list', which contains many IP addresses. Then I have to log in to those IPs one by one with the ssh command and a for loop, and after logging in, use an awk command to fetch the 7th column, which gives info about the server.
I am just at the initial stage of shell scripting and do not have that understanding yet.
I tried the command below, but it didn't work for me.
for line in cat /tmp/list
do
echo $i
echo "***********"
ssh $i |grep tsm |awk -F : '{print $7, "\t", $1}'
echo
done
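For reference, a minimal sketch of a working loop might look like the one below; the grep pattern and the colon-separated 7th field come from the attempted command, but the remote file (/etc/passwd here) and everything else are assumptions:
#!/bin/bash
# Log in to each IP listed in /tmp/list and print the 7th
# colon-separated field (plus the 1st) of lines containing "tsm".
# /etc/passwd is an assumed target; adjust the remote command.
while IFS= read -r ip; do
    echo "$ip"
    echo "***********"
    # -n stops ssh from consuming the rest of /tmp/list via stdin
    ssh -n "$ip" "grep tsm /etc/passwd" | awk -F: '{print $7, "\t", $1}'
    echo
done < /tmp/list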

Want to read variable value from remote file

In one of my bash scripts I want to read and use a variable's value from another script which is on a remote machine.
How should I go about resolving this? Any related info would be helpful.
Thanks in advance!
How about this (which is code I cannot currently test myself):
text=$(ssh yourname@yourmachine 'grep uploadRate= /root/yourscript')
It assumes that the value of the variable is contained in one line. The variable text now contains your variable assignment, presumably something like
uploadRate=1MB/s
There are several ways to convert the text/code into a real variable assignment in your current script, like evaluating the string or using grep. I would recommend
uploadRate=${text#*=}
to just remove the part up to and including the =.
Edit: One more caveat to mention is that this only works if the original assignment does not contain variable references itself like in
uploadRate=1000*${kB}/s
ssh user@machine 'command'
will print the standard output of the remote command.
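Putting those pieces together, a minimal sketch (using the same example host, path, and variable name as above) might be:
#!/bin/bash
# Fetch the assignment line from the remote script, then strip
# everything up to and including "=" to keep just the value.
text=$(ssh yourname@yourmachine 'grep uploadRate= /root/yourscript')
uploadRate=${text#*=}
echo "Remote upload rate: $uploadRate"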
I can suggest at least two ways:
1) You can simply copy the output file from the remote server to your system with the scp command; that would work for you. Then the script on your machine should read that file as an argument.
Script on your machine:
read -t 50 -p "Waiting for argument: " filename
It waits for the output from the remote machine. Then you can run:
sshpass -p<password> scp user@host:/Path/to/file /path/to/script/
What you need to do: tell the script on your machine that the file fetched by the scp command is its argument ($1).
2) Run the script from your machine:
#!/bin/bash
script='
#Your commands
'
sshpass -p<password> ssh user@host "$script"
There are also other ways to run a script that does something on the remote machine.
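A concrete sketch of the first approach, with placeholder host, paths, and variable name (none of these come from the original question):
#!/bin/bash
# Copy the remote script locally, then extract the value assigned
# to uploadRate in it. Host, password handling, and paths are
# placeholders; prefer SSH keys over sshpass where possible.
sshpass -p"$SSHPASS" scp user@host:/root/yourscript /tmp/yourscript
line=$(grep uploadRate= /tmp/yourscript | head -n1)
uploadRate=${line#*=}
echo "uploadRate is $uploadRate"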

Copy Bash profile from remote system up to a certain line number point

I would like to copy a file from a remote machine onto my local machine, up to the first line containing a certain pattern.
Scenario: update my local Bash profile with a part of the remote Bash profile, up to the point where my admin has verified it.
Is there a better way (I guess there likely is!) than this quick "shell scripting" hack?
ssh tinosino@robottinosino-wifi cat /Users/tinosino/.profile | sed '/Verify this script further than this point/,$ d' > /home/tinosino/Desktop/tinosino_bash_profile.sh
Remote machine: robottinosino-wifi (OSX)
Sentinel line: Verify this script further than this point
I can use basic shell scripting, preferably in Bash as it's the default, or the most common diff/source-control binaries.
The idea, as you can guess, is to ultimately automate this process. Cron? Any idea as to how you would do this? The start of my Bash profile should come from the server; the "rest" is free for me to customise.
Previous failed attempts of mine:
using head
using process substitution <( ... )
using grep
using a local named pipe (this was fun: the named pipe needs a program generating its text though, executing something like the cat->sed line above)
Important note: what would be highly desirable is for the remote system not to go through the entire file, but to stop filtering once it "sees" the sentinel line. If the pattern is on line 300 of 1,000,000,000, just read 300 lines.
The problem is that your sed command is structured to read through the entire file.
You can use sed -n '/Verify this script/q; p' to instead quit once the line is found:
ssh tinosino@robottinosino-wifi cat /Users/tinosino/.profile | sed -n '/Verify this script/q; p' > /home/tinosino/Desktop/tinosino_bash_profile.sh
Or without the useless use of cat, which doesn't make a significant difference in this case, but which will transfer less data if you want to remove multiple sections later:
ssh tinosino@robottinosino-wifi "sed -n '/Verify this script/q; p' /Users/tinosino/.profile" > /home/tinosino/Desktop/tinosino_bash_profile.sh
Just perform the filtering on the remote server.
ssh tinosino@robottinosino-wifi sed -n 'p;/Verify.../q' /Users/tinosino/.profile \
>>/home/tinosino/Desktop/tinosino_bash_profile.sh
The -n flag and the p and q commands together print only the lines up to, and including, the first line that matches "Verify...": p prints each line, and q quits right after the sentinel line has been printed.
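To automate this with cron, as the question suggests, something along these lines might work; the schedule, the script location, and the local customisation file are all assumptions:
#!/bin/bash
# sync_profile.sh (hypothetical name): rebuild ~/.profile from the
# verified remote head plus local customisations kept in a separate
# file. All paths below are placeholders.
ssh tinosino@robottinosino-wifi \
    "sed -n '/Verify this script further than this point/q; p' /Users/tinosino/.profile" \
    > /tmp/profile_head
cat /tmp/profile_head ~/.profile_local > ~/.profile
# Example crontab entry (add with "crontab -e") to run it daily at 06:00:
# 0 6 * * * /home/tinosino/bin/sync_profile.sh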

Shell script runs from command line, not cron

I have a script that updates a server with some stats once per day. The script works as intended when running from command line, but when running from cron some of the variables are not passed to curl.
Here is an example of the code:
#!/bin/sh
PATH=/bin:/sbin:/usr/bin:/usr/sbin
/bin/sh /etc/profile
MACADDR=$(ifconfig en0 | grep ether | awk '{print $2}')
DISKUSED=$(df / | awk '{print $3}' | tail -n1)
DISKSIZE=$(df / | awk '{print $2}' | tail -n1)
# HTTP GET PARAMS
GET_DELIM="&"
GET_MAC="macaddr"
GET_DS="disk_size"
GET_DU="disk_used"
# Put together the query
QUERY1=$GET_MAC=$MACADDR$GET_DELIM$GET_DS=$DISKSIZE$GET_DELIM$GET_DU=$DISK_USED
curl http://192.168.100.150/status.php?$QUERY1
The result in the cron job is http://192.168.100.150/status.php?macaddr=&disk_size=&disk_used=
I am not sure if it is some problem with the variables, or possibly with awk trying to parse data with no terminal size specified, etc.
Any help is appreciated.
When you're running into problems like this it's almost always an environment issue.
Dump the results of "env" to a file and inspect that. You can also run your script with a top line of
#!/bin/sh -x
to see what's happening to all the variables. You might want to use a wrapper script so you can redirect the output this provides for analysis.
The very first command in your script, ifconfig, is found at /sbin/ifconfig on a Mac, while the default PATH variable for cron jobs is set to /usr/bin:/bin. That is probably why the rest of your commands are failing as well.
It is better to set the PATH manually at the top of your script. Something like:
export PATH=$PATH:/sbin
One problem I've run into with cron jobs is that variables you take for granted do not exist. The main one you take for granted is the PATH variable.
Echo what you have set as your path when running from the command line and put that at the top of your script (or at the top of the crontab).
Alternatively, specify the full path to each command: ifconfig, awk, grep, etc.
I would guess that will fix the problem.
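As a sketch, a cron-safe version of the top of the script might look like this; verify the actual locations with "which ifconfig" etc. on your system:
#!/bin/sh
# Set PATH explicitly so the script does not depend on cron's
# minimal default environment (/usr/bin:/bin).
PATH=/bin:/sbin:/usr/bin:/usr/sbin
export PATH
# Or call commands by absolute path instead:
MACADDR=$(/sbin/ifconfig en0 | /usr/bin/grep ether | /usr/bin/awk '{print $2}')
echo "MAC address: $MACADDR"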
