Looping through file to create other command files - shell

I am trying to create a script that will automatically log me in to a specific remote device (let's call it a fw). The "command" is a bit elaborate, as we are logging in from a protected network server, and there are hundreds of these devices to log in to.
I have created a file with two parameters (command and name) separated by "#". The first parameter is the "command" string, which contains spaces (i.e. "sudo --user ...."), and I want to put (echo) it into an executable file called "name" (the name of the device I want to log in to).
My logic was originally:
for line in $(awk -F# '{print $1, $2}' list.txt), do touch $2; && echo "$1 > $2" && chmod +x $2; done
The end result should be x number of files named "$name", each containing only the one-line command "$command", and each executable.
I have tried several things to make this work. I can iterate over the file without much issue using for, while, and even [[ -n $name ]], but this only gives me one variable and doesn't split the line into the two I need, "$command" and "$name". Even $1 and $2 would be fine for my purposes...
While testing:
$ while IFS=# read -r line; do echo "$line"; done < list
sudo --user xxxxxxxxxxxxxx#yyyyyyyyy
sudo --user xxxxxxxxxxxxxx#yyyyyyyyy
sudo --user xxxxxxxxxxxxxx#yyyyyyyyy
Even using IFS=# to split $line doesn't remove the "#" as expected.
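(Note: read only splits a line on IFS when it is given more than one variable name; with a single variable the whole line, separator included, is assigned to it. A minimal pure-shell sketch of that approach, assuming list.txt holds one "command#name" pair per line, and that any <CR> characters from DOS line endings are stripped first:)

while IFS='#' read -r command name; do
    printf '%s\n' "$command" > "$name"   # write the command as the file's only line
    chmod +x "$name"                     # make the file executable
done < list.txt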
for-looping:
$ for line in $(cat list); do echo $line; done
sudo --user xxxxxxxxxxxxxx
yyyyyyyyy
sudo --user xxxxxxxxxxxxxx
yyyyyyyyy
sudo --user xxxxxxxxxxxxxx
yyyyyyyyy
Trying to expand to:
bin$ for line in $(cat list); do awk -F# '{print $1, $2}' $line; done
awk: fatal: cannot open file ` xxxxxxxxxxxxxxxxx' for reading (No such file or directory)
awk: fatal: cannot open file `yyyyyyyyy
sudo --user xxxxxxxxxxxxxxxxx' for reading (No such file or directory)
awk: fatal: cannot open file `yyyyyyyyy
sudo --user xxxxxxxxxxxxxxxxx' for reading (No such file or directory)
I would like to parse (loop) through the file, separate the parameters, create $name with $command inside, and chmod +x $name, so that I end up with an executable that will log me in automatically to the "$name" node.

I suggest moving all your logic into an awk script.
script.awk
BEGIN { FS = "[\r#]" }          # set the field separator to "#" or <CR>
{                               # for each input line
    print $1 > $2;              # write the 1st field to a file named after the 2nd field
    system("chmod a+x " $2);    # make that file executable
}
running the script:
awk -f script.awk list.txt
input list.txt
sudo --user xxxxxxxxxxxxxx#yyyyyyyy1
sudo --user xxxxxxxxxxxxxx#yyyyyyyy2
sudo --user xxxxxxxxxxxxxx#yyyyyyyy3
output:
dudi@DM-840$ ls -l yy*
total 3
-rwxrwxrwx 1 dudi dudi 28 Jun 23 01:21 yyyyyyyy1*
-rwxrwxrwx 1 dudi dudi 28 Jun 23 01:21 yyyyyyyy2*
-rwxrwxrwx 1 dudi dudi 28 Jun 23 01:21 yyyyyyyy3*
Update:
I changed FS to also include the <CR> character; otherwise it gets appended to the filenames (it shows up as ^M).

Related

bash script to access a file in a remote host three layers deep

So in the terminal I access the remote host through ssh -p, and once I'm in I have to cd /directory1/directory2/. Then I want to find the latest directory, which I do using ls -td -- */ | head -n 1, and then cd into it and run tail -n 1 file1.
All these commands work in the terminal, but I want to automate it so that I can just type ./tailer.sh and get that output.
Any ideas would be appreciated.
The shell script tailer.sh can look something like this
#!/bin/bash
ssh -p <PORT> <HOST_NAME> '( cd /directory1/directory2/ && LATEST_DIR=$(ls -td -- */ | head -n 1) && cd ${LATEST_DIR} && tail -n 1 file1 )'
Then give execute permissions to tailer.sh using chmod u+x tailer.sh
Run the script using ./tailer.sh
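A small refinement worth considering (not part of the original answer): quoting the directory name on the remote side, so the script still works if the latest directory name ever contains spaces:

ssh -p <PORT> <HOST_NAME> 'cd /directory1/directory2/ && cd -- "$(ls -td -- */ | head -n 1)" && tail -n 1 file1'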

SCP command with a space in the target

I would like to make a scp command with a variable for the file destination but, in the variable I have a space.
~ $ target=C:/Users/exemple/a folder with space/data
~ $ scp -r -p file.txt $USER@$IP_TARGET:${target}
space/data: No such file or directory
How can I do this?
I succeeded with this:
target='"C:/Users/exemple/a folder with space/data"'
Or this:
target=\"C:/Users/exemple/a folder with space/data\"
and use:
scp -r -p file.txt $USER@$IP_TARGET:"$target"
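The reason the doubled quoting helps: with the traditional scp protocol the remote side interprets the path again, so a path containing spaces needs its own layer of quoting in addition to the local one. A minimal sketch of the working combination (assuming $USER and $IP_TARGET are already set):

target='"C:/Users/exemple/a folder with space/data"'   # inner quotes are kept for the remote side
scp -r -p file.txt "$USER@$IP_TARGET:$target"          # outer quotes protect the local expansion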

Cron with command requiring sudo

What would be my options to make a script out of a command where I need to enter my sudo password?
I'm exporting an fsimage and would like to do it on a regular basis. It could be run from my account, but ideally I would like to create a user dedicated to making these exports.
I would like to stay away from a root cron and use a more secure way of doing this.
Entire command looks like this:
sudo ssh czmorchr 'hdfs oiv -p Delimited -i $(ls -t /dfs/nn/current/fsimage_* | grep -v md5 | head -1) -o /dev/stdout 2>/dev/null' |
  grep -v "/.Trash/" |
  sed -e 's/\r/\\r/g' |
  awk 'BEGIN { FS="\t"; OFS="\t" }
       $0 !~ /_impala_insert_staging/ && ($0 ~ /^\/user\/hive\/warehouse\/cz_prd/ || $0 ~ /^\/user\/hive\/warehouse\/cz_tst/) { split($1,a,"/"); db=a[5]; table=a[6]; gsub(".db$", "", table); }
       db && $10 ~ /^d/  { par=""; for(i=7;i<=length(a);i++) par=par"/"a[i] }
       db && $10 !~ /^d/ { par=""; for(i=7;i<=length(a)-1;i++) par=par"/"a[i]; file=a[length(a)] }
       NR > 1 { print db, table, par, file, $0 }' |
  hadoop fs -put -f - /user/hive/warehouse/cz_prd_mon_ma.db/hive_warehouse_files/fsimage.tsv
To do something with sudo without entering a password, there is an unsafe way, like
echo ubuntu | sudo -S ls
Here I'm running an ls command as the ubuntu user with the ubuntu password.
As you can see, piping the password to sudo -S works.
Additionally, you need to make the user a sudoer;
here is an example: https://askubuntu.com/questions/7477/how-can-i-add-a-new-user-as-sudoer-using-the-command-line.
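If the goal is to avoid storing the password entirely, a narrower sudoers rule is the usual alternative; a sketch (edit with visudo; "exportuser" is a hypothetical dedicated account):

exportuser ALL=(ALL) NOPASSWD: /usr/bin/ssh

This lets that account run only ssh via sudo without a password, instead of piping the password into sudo -S.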
I was able to resolve this issue using the setfacl command. (Context: I set up another folder in HDFS where the standby node should dump its fsimages.) Then I used this command, and after that I was able to run the script above without sudo and in a crontab.
setfacl -m u:hdfs:rwx /home/user_name/fsimage-dump/namenode
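For completeness, the job can then be scheduled from the dedicated user's own crontab; a sketch (the schedule and script path are examples, not from the answer):

0 2 * * * /home/user_name/fsimage-export.sh >> /home/user_name/fsimage-dump/export.log 2>&1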

FTP not working UNIX

Hi, I have a script where I am performing sudo and going to a particular directory, and within that directory editing file names as required. After getting the required file name I want to FTP the files to a Windows machine, but after reading the FTP commands the script says:
-bash: line 19: quote: command not found
-bash: line 20: quote: command not found
-bash: line 21: put: command not found
-bash: line 22: quit: command not found
My FTP works if I run it normally, so it is some other problem. The script is below:
#!/usr/bin/
path=/global/u70/glob
echo password | sudo -S -l
sudo /usr/bin/su - glob << 'EOF'
#ls -lrt
cd "$path"
pwd
for entry in $(ls -r)
do
    if [ "$entry" = "ADM" ]; then
        cd "$entry"
        FileName=$(ls -t | head -n1)
        echo "$FileName"
        FileNameIniKey=$(ls -t | head -n1 | cut -c 12-20)
        echo "$FileNameIniKey"
        echo "$xmlFileName" >> "$xmlFileNameIniKey.ini"
        chmod 755 "$FileName"
        chmod 755 "$FileNameIniKey.ini"
        ftp -n hostname
        quote USER ftp
        quote PASS
        put "$FileName"
        quit
        rm "$FileNameIniKey.ini"
    fi
done
EOF
You can improve your questions and make them easier to answer and more useful for future readers by including a minimal, self-contained example. Here's an example:
#!/bin/bash
ftp -n mirrors.rit.edu
quote user anonymous
quote pass mypass
ls
When executed, you get a manual FTP session instead of a file listing:
$ ./myscript
Trying 2620:8d:8000:15:225:90ff:fefd:344c...
Connected to smoke.rc.rit.edu.
220 Welcome to mirrors.rit.edu.
ftp>
The problem is that you're assuming that a script is a series of strings that are automatically typed into a terminal. This is not true. It's a series of commands that are executed one after another.
Nothing happens with quote user anonymous until AFTER ftp has exited, and then it's run as a shell command instead of being fed to ftp's input.
Instead, specify login credentials on the command line and then include commands in a here document:
ftp -n "ftp://anonymous:passwd#mirrors.rit.edu" << end
ls
end
This works as expected:
$ ./myscript
Trying 2620:8d:8000:15:225:90ff:fefd:344c...
Connected to smoke.rc.rit.edu.
220 Welcome to mirrors.rit.edu.
331 Please specify the password.
230 Login successful.
Remote system type is UNIX.
Using binary mode to transfer files.
200 Switching to Binary mode.
229 Entering Extended Passive Mode (|||19986|).
150 Here comes the directory listing.
drwxrwxr-x 12 3002 1000 4096 Jul 11 20:00 CPAN
drwxrwsr-x 10 0 1001 4096 Jul 11 21:08 CRAN
drwxr-xr-x 18 1003 1000 4096 Jul 11 18:02 CTAN
drwxrwxr-x 5 89987 546 4096 Jul 10 10:00 FreeBSD
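Applied back to the original script, the same idea would look roughly like this as a standalone sketch ("hostname", the ftp user, and the password are placeholders from the question; the transfer commands sit in a here-document fed to ftp's input):

ftp -n hostname << END
quote USER ftp
quote PASS password
put "$FileName"
quit
END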
ftp -n "ftp://anonymous:passwd#mirrors.rit.edu" << end
Name or service not known

./script.sh: line 8: /etc/passwd: Permission denied

I have this script which I can't execute:
#!/bin/bash
USERS="/etc/passwd"
for user in `$USERS | cut -f 1 -d ':'`
do
echo $user
done
This is the output of ls -l script.sh:
-rwxrwxrwx 1 user user 94 Jul 30 21:24 script.sh
What am I doing wrong? :|
I also tried running it as root and with sudo and nothing worked...it's annoying...
You're trying to execute /etc/passwd and send the output to cut. You want to redirect the contents of the file:
for user in `cut -f 1 -d : < $USERS`
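Putting that back into the script (otherwise unchanged), a minimal working version looks like this:

#!/bin/bash
USERS="/etc/passwd"
for user in `cut -f 1 -d ':' < "$USERS"`
do
    echo "$user"
done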
