Remote ssh command with arguments [duplicate] - bash

This question already has answers here:
How to use bash $(awk) in single ssh-command?
(6 answers)
Closed 5 years ago.
I know that I can run a command remotely from another machine using ssh -t login@machine "command", however I am struggling to run a more complex command like this:
watch "ps aux | awk '{print $1}' | grep php-fpm | wc -l"
I am trying with different kind of quotes however although watch command seems to be firing, it's showing errors like:
awk: cmd. line:1: {print
awk: cmd. line:1:        ^ unexpected newline or end of string

The thing is, the $ is expanded by the local shell before the command is passed to ssh. You need to deprive it of its special meaning locally by escaping it before passing it to ssh:
ssh -t login@machine watch "ps aux | awk '{print \$1}' | grep php-fpm | wc -l"
The error you are seeing occurs because, when the local shell tries to expand $1, it does not find a value for it and substitutes an empty string, leaving awk with a truncated program.
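You can see the difference locally, without ssh at all; here set -- merely gives the current shell a positional parameter to expand:

```shell
# The calling shell expands $1 inside double quotes before ssh ever
# runs; escaping it as \$1 preserves it for the remote side.
set -- firstarg                 # give the local shell a $1 to expand
echo "awk '{print $1}'"         # $1 expanded locally: {print firstarg}
echo "awk '{print \$1}'"        # \$1 survives intact: {print $1}
```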
Also, you could replace your shell pipeline containing awk and grep with a single awk program:
ssh -t login@machine watch "ps aux | awk '\$1 == \"php-fpm\"{count++}END{print count}'"
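A quick local check of that counting logic, using fabricated input lines in which field 1 plays the role the first ps column had in the original pipeline (the +0 is a small addition so an empty count prints 0 rather than a blank line):

```shell
# Fabricated ps-style lines: field 1 is the value being compared.
printf 'php-fpm 101\nroot 1\nphp-fpm 102\n' \
  | awk '$1 == "php-fpm"{count++} END{print count+0}'
# prints: 2
```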

Related

how can you use awk's positional args in a shell script

I use the following snippet to kill abruptly any java process running:
ps -ef | grep java | grep -v grep | awk "{print $2}" | xargs kill -9
I would like to have this in a shell script that I can run on multiple machines, but when I put it in the script and run it, it takes the $2 in the awk as a passed-in argument. I tried single, double and triple backslashing the $ but nothing works.
Single escape results in:
can't read "2": no such variable
Double escape results in:
awk: cmd. line:1 {print
awk: cmd. line:1: ^ unexpected newline or end of string
Triple escape results in:
awk: cmd. line:1 {print
awk: cmd. line:1: ^ unexpected newline or end of string
So I'm looking for a way to pass the $ arguments awk uses into a shell script
Could you please try the following (taking inspiration from @James' fine answer here).
ps -ef | awk '/[j]ava/{cmd=(cmd?cmd OFS:"")$2} END{print "kill -9 " cmd}'
This will only print the kill command; once you are happy with its results (pid numbers etc.), run the following command to actually kill the pids.
ps -ef | awk '/[j]ava/{cmd=(cmd?cmd OFS:"")$2} END{print "kill -9 " cmd}' | bash
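To see how the accumulator builds the PID list, you can feed the same awk program fabricated ps -ef style lines:

```shell
# cmd=(cmd?cmd OFS:"")$2 appends each matching $2, space-separated.
printf 'u 11 1 0 java\nu 22 1 0 java\n' \
  | awk '/[j]ava/{cmd=(cmd?cmd OFS:"")$2} END{print "kill -9 " cmd}'
# prints: kill -9 11 22
```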
How about:
$ ps -ef | awk '/[j]ava/{print $2}'
Sample output:
28510
There's a command that does this:
pkill java
pkill accepts the -signal (e.g. -9) notation just like kill, but I recommend starting with weaker signals to give the targeted processes time to shut down properly. I usually do -HUP (hangup, -1), then the default (no flag, which sends terminate, -TERM or -15), then -9 (kill, -KILL).
If you want to know what that'll do before running it, you can list the matching process IDs with:
pgrep java
or for more details:
pgrep -a java
(or pgrep java | xargs ps -f if you prefer)
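The pgrep/pkill pairing can be tried safely with a throwaway process (a long sleep) standing in for java:

```shell
# Demo: match by exact process name (-x) instead of java.
sleep 600 &
pgrep -x sleep           # list PIDs whose name is exactly "sleep"
pkill -TERM -x sleep     # polite first; escalate to -KILL only if needed
```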

bash variable not available after running script [duplicate]

This question already has answers here:
Global environment variables in a shell script
(7 answers)
Closed 5 years ago.
I have a shell script that assigns my IP address to a variable, but after running the script, I cannot access the variable in bash. If I put an echo in the script, it will print the variable, but it does not save it after the script is done running.
Is there a way to change the script to access it after it runs?
ip=$(/sbin/ifconfig | grep "inet " | awk '{print $2}' | grep -v 127 | cut -d":" -f2)
I am using terminal on a Mac.
A script by default runs in a child process, which means the current (calling) shell cannot see its variables.
You have the following options:
Make the script output the information (to stdout), so that the calling shell can capture it and assign it to a variable of its own. This is probably the cleanest solution.
ip=$(my-script)
Source the script to make it run in the current shell as opposed to a child process. Note, however, that all modifications to the shell environment you make in your script then affect the current shell.
. my-script # any variables defined (without `local`) are now visible
Refactor your script into a function that you define in the current shell (e.g., by placing it in ~/.bashrc); again, all modifications made by the function will be visible to the current shell:
# Define the function
my-func() { ip=$(/sbin/ifconfig | grep "inet " | awk '{print $2}' | grep -v 127 | cut -d":" -f2); }
# Call it; $ip is implicitly defined when you do.
my-func
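The difference between running and sourcing can be sketched with a throwaway one-line script (the /tmp path is arbitrary):

```shell
# A script that only assigns a variable.
printf 'ip=192.0.2.1\n' > /tmp/setvar.sh
unset ip
bash /tmp/setvar.sh          # child process: the assignment dies with it
echo "${ip:-unset}"          # -> unset
. /tmp/setvar.sh             # sourced: runs in the current shell
echo "${ip:-unset}"          # -> 192.0.2.1
```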
As an aside: You can simplify your command as follows:
/sbin/ifconfig | awk '/inet / && $2 !~ /^127/ { print $2 }'
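You can check the combined filter against fabricated ifconfig-style lines:

```shell
# /inet / selects address lines; $2 !~ /^127/ drops loopback.
printf 'inet 127.0.0.1 netmask\ninet 192.168.1.5 netmask\n' \
  | awk '/inet / && $2 !~ /^127/ { print $2 }'
# prints: 192.168.1.5
```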

using makefile variable in sed command [duplicate]

This question already has an answer here:
Sed command in makefile
(1 answer)
Closed 6 years ago.
I have tried putting the following command in makefile.
#get Local Ip Address
LOCALIP=$(shell ifconfig | grep -Eo 'inet (addr:)?([0-9]*\.){3}[0-9]*' | grep -Eo '([0-9]*\.){3}[0-9]*' | grep -v '127.0.0.1' | awk '{print $1}') &
#get Web Url from User
#read -p "Enter Web Url:" weburl; \
sed -e "\|$LOCALIP $weburl|h; \${x;s|$LOCALIP $weburl||;{g;t};a\\" -e "$LOCALIP $weburl" -e "}" hosts.txt
When I try to execute the command, I expected to get the sed command like following:
sed -e "\|192.168.5.1 www.weburl.com|h; \${x;s|192.168.5.1 www.weburl.com||;{g;t};a\\" -e "192.168.5.1 www.weburl.com" -e "}" hosts.txt
But, I get the following,
sed -e "\|/s/$/OCALIP eburl|h; \" hosts.txt
In Makefiles, variables longer than a single character (i.e. all variables that you're likely to define) need to be referenced as ${varname} or $(varname), not $varname. The latter expands as the value of the single-character variable $v followed by the literal string arname, as you discovered.
I won't start to parse the rest of that Makefile as the piping looks a bit questionable.
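For completeness, a minimal sketch of the corrected pattern (the target name and the hostname -I command are placeholders, not taken from the question): make expands one level of $ in recipes and variable definitions, so make variables are written $(LOCALIP) and dollars destined for the shell or awk are doubled.

```make
# Make expands $(LOCALIP) itself; $$1 reaches the shell as $1 for awk.
LOCALIP := $(shell hostname -I | awk '{print $$1}')

show-ip:
	@echo "local ip is $(LOCALIP)"
```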

How do i store the output of a bash command in a variable? [duplicate]

This question already has answers here:
How do I set a variable to the output of a command in Bash?
(15 answers)
Closed 8 years ago.
I'm trying to write a simple script for killing a process. I've already read Find and kill a process in one line using bash and regex so please don't redirect me to that.
This is my code:
LINE=$(ps aux | grep '$1')
PROCESS=$LINE | awk '{print $2}'
echo $PROCESS
kill -9 $PROCESS
I want to be able to run something like
sh kill_proc.sh node and have it run
kill -9 node
But instead what I get is
kill_process.sh: line 2: User: command not found
I found out that when I log $PROCESS it is empty.
Does anyone know what I'm doing wrong?
PROCESS=$(echo "$LINE" | awk '{print $2}')
or
PROCESS=$(ps aux | grep "$1" | awk '{print $2}')
I don't know why you're getting the error you quoted. I can't reproduce it. When you say this:
PROCESS=$LINE | awk '{print $2}'
the shell expands it to something like this:
PROCESS='mayoff 10732 ...' | awk '{print $2}'
(I've shortened the value of $LINE to make the example readable.)
The first subcommand of the pipeline sets variable PROCESS; this variable-setting command has no output so awk reads EOF immediately and prints nothing. And since each subcommand of the pipeline runs in a subshell, the setting of PROCESS takes place only in a subshell, not in the parent shell running the script, so PROCESS is still not set for later commands in your script.
(Note that some versions of bash can run the last subcommand of the pipeline in the current shell instead of in a subshell, but that doesn't affect this example.)
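The subshell behaviour is easy to reproduce without ps at all:

```shell
# The left side of a pipe runs in a subshell, so the assignment
# never reaches the parent shell.
unset PROCESS
PROCESS=hello | cat          # assignment happens in the subshell
echo "${PROCESS:-unset}"     # -> unset
```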
Instead of setting PROCESS in a subshell and feeding nothing to awk on standard input, you want to feed the value of LINE to awk and store the result in PROCESS in the current shell. So you need to run a command that writes the value of LINE to its standard output, and connects that standard output to the standard input of awk. The echo command can do this (or the printf command, as chepner pointed out in his answer).
You need to use echo (or printf) to actually put the value of $LINE onto the standard input of the awk command.
LINE=$(ps aux | grep "$1")
PROCESS=$(echo "$LINE" | awk '{print $2}')
echo $PROCESS
kill -9 $PROCESS
There's no need to use LINE; you can set PROCESS with a single line
PROCESS=$(ps aux | grep "$1" | awk '{print $2}')
or better, skip the grep:
PROCESS=$(ps aux | awk -v pname="$1" '$1 ~ pname {print $2}')
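The -v pname match can be exercised on fabricated ps-style lines (note that in real ps aux output field 1 is the user, so the field you compare against may need adjusting):

```shell
# $1 ~ pname keeps lines whose first field matches the pattern.
printf 'node 4242 0.0\nroot 1 0.0\n' \
  | awk -v pname="node" '$1 ~ pname {print $2}'
# prints: 4242
```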
Finally, don't use kill -9; that's a last resort for debugging faulty programs. For any program that you didn't write yourself, kill "$PROCESS" should be sufficient.

Escape single quotes ssh remote command

I have read several solutions for escaping single quotes in a remote command over ssh, but none of them work.
I'm trying:
ssh root@server "ps uax|grep bac | grep -v grep | awk '{ print $2 }' > /tmp/back.tmp"
awk doesn't work. I also tried:
ssh root@server "ps uax|grep bac | grep -v grep | awk \'{ print $2 }\' > /tmp/back.tmp"
which gives:
awk: '{
awk: ^ invalid character ''' in the expression
I also tried putting single quotes around the command, but that doesn't work either.
I'd appreciate any help.
The ssh command treats all text typed after the hostname as the remote command to be executed. Critically, what this means for your question is that you do not need to quote the entire command as you have done. Rather, you can send the command through as you would type it on the remote system itself.
This simplifies dealing with quoting issues, since it reduces the number of quotes you need. Since you won't be wrapping the command in quotes, all special bash characters need to be escaped with backslashes.
In your situation, you need to type,
ssh root@server ps uax \| grep ba[c] \| awk \'{ print \$2 }\' \> /tmp/back.tmp
or you could double quote the single quotes instead of escaping them (in both cases, you need to escape the dollar sign)
ssh root@server ps uax \| grep ba[c] \| awk "'{ print \$2 }'" \> /tmp/back.tmp
Honestly this feels a little more complicated, but I have found this knowledge pretty valuable when dealing with sending commands to remote systems that involve more complex use of quotes.
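A quick local check of what those unquoted escapes produce (running echo locally rather than through ssh):

```shell
# \' yields a literal single quote and \$2 a literal $2, so the
# remote shell receives exactly the awk program you intended.
echo \'{ print \$2 }\'
# prints: '{ print $2 }'
```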
In your first try you used double quotes ("), so you need to escape the $ character:
ssh root@server "ps uax|grep bac | grep -v grep | awk '{ print \$2 }' > /tmp/back.tmp"
Also, you can use:
ps uax | grep 'ba[c]' | ...
so then you don't need the grep -v grep step.
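Why the [c] trick works: grep's own command line contains the literal string "ba[c]", which the pattern ba[c] (matching the literal text "bac") never matches. A local demonstration:

```shell
# First line mimics the grep process's own command line; only the
# second line actually contains the substring "bac".
printf 'grep ba[c]\n/usr/sbin/bacula\n' | grep 'ba[c]'
# prints: /usr/sbin/bacula
```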
