How to execute text extracted through grep and sed in bash

I am trying to execute a command extracted from a README file.
I was able to extract it using grep and sed:
cat README.md | grep -i "docker build" | grep -vi "dockerfile.debug" | sed 's/.*\(d[a-z]\).*/\1/'
This pipeline gives a result something like 'docker build .'.
I want to execute that command.
But I am not sure how to execute the extracted text. I thought 'exec' would work, but I couldn't get it to apply here. Please help me find a way to execute the text extracted by the script above.

Put your command inside
$(CommandToExecute)
or back-ticks:
`CommandToExecute`
For example:
$(cat README.md | grep -i "docker build" | grep -vi "dockerfile.debug" | sed 's/.*\(d[a-z]\).*/\1/')

Try (this version also drops the unnecessary cat):
$(grep -i "docker build" README.md | grep -vi "dockerfile.debug" | sed 's/.*\(d[a-z]\).*/\1/')
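If you want to look at the extracted text before running it, capture it in a variable first and execute it with eval. A minimal sketch (eval re-parses the string through the shell, so only use it on input you trust):
cmd=$(grep -i "docker build" README.md | grep -vi "dockerfile.debug" | sed 's/.*\(d[a-z]\).*/\1/')
printf 'About to run: %s\n' "$cmd"   # inspect before executing
eval "$cmd"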

Related

How to run a command like xargs on a grep output of a pipe of a previous xargs from a command in Bash

I'm trying to understand what's happening here out of curiosity, even though I can just copy and paste the terminal output to do what I need. The following command does not print anything.
ls /opt/local/var/macports/registry/portfiles -1 | sed 's/-.*//g' | sort -u | parallel "sudo port -N install" {} 2>&1 | grep -Po "Use '\K.*(?=')" | parallel "{}"
The directory I call ls on contains a bunch of filenames starting with the string I want to extract, which ends at the first dash (so stringexample-4.2009 yields stringexample); each result is piped into parallel (like xargs, but it runs each line as a separate command). After running the command sudo port install <stringexample>, I get error output like so:
Unable to activate port <stringexample>. Use 'port -f activate <stringexample>' to force the activation.
Now I wish to run port -f activate <stringexample>. However, I cannot seem to do anything with the output (e.g. port -f activate gettext) that reaches the terminal.
I cannot even do ... | grep -Po "Use '\K.*(?=')" | xargs echo or ... | grep -Po "Use '\K.*(?=')" >> commands_to_run.txt (the redirection to a file only creates an empty file), despite the shorter version of the command:
ls /opt/local/var/macports/registry/portfiles -1 | sed 's/-.*//g' | sort -u | parallel "sudo port -N install {}" 2>&1 | grep -Po "Use '\K.*(?=')"
printing the commands to the terminal. Why does the pipe operator not work here? If the commands I wish to run are being printed to the terminal, surely there has to be a way to capture them.
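For what it's worth, the grep -Po part does extract the command from a sample error line: \K discards everything matched before it, and (?=') is a lookahead that stops the match before the closing quote. A quick test, using gettext as a stand-in port name:
echo "Unable to activate port gettext. Use 'port -f activate gettext' to force the activation." |
    grep -Po "Use '\K.*(?=')"
# prints: port -f activate gettext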

How to escape shell script to run it using command_string option?

I am trying to escape the following script so that it can be run via the shell's command-string option (/bin/sh -c).
privateIP=$(ifconfig eth0 | grep "inet " | awk \'{print $2}\');
sed -i "s/http:\/\/:/http:\/\/$privateIP:/g" init.conf
Please elaborate on the answer.
Your question is not clear, but perhaps you are looking for:
sh -c 'privateIP=$(ifconfig eth0 | awk "/inet/{print \$2}");
sed -i "s#http://:#http://$privateIP:#g" init.conf'

Saving files to separate html file before using grep/sed

I'm working on a project that lets me navigate some urls. Right now I have:
#!/bin/bash
for file in $1
do
    wget $1 >> output.html
    cat output.html | grep -o '<a .*href=.*>' |
        sed -e 's/<a /\n<a /g' |
        sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e '/^$/ d' |
        grep 'http'
done
I want the user to be able to run the script as follows:
./navigator google.com
which will save the source of the URL into a new HTML file, then run my grep/sed pipeline on it and save the result to a new file.
Right now I'm struggling with saving the URL's source into a new HTML file. Help!
To create a new file for each URL, use the URL in the output filename passed to wget's -O option:
#!/bin/bash
for url; do
    out="output-$url.html"
    wget -q "$url" -O "$out"
    grep -o '<a .*href=.*>' "$out" |
        sed -e 's/<a /\n<a /g' |
        sed -e 's/<a .*href=['"'"'"]//' -e 's/["'"'"'].*$//' -e '/^$/ d' |
        grep 'http'
done
PS: -q makes wget completely quiet, so only the extracted links are printed.
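Usage would then look something like this (a hypothetical invocation; wget prepends http:// when the scheme is missing):
./navigator google.com example.org
# creates output-google.com.html and output-example.org.html
# and prints the extracted links for each page
Note that the URL becomes part of the filename, so pass bare host names rather than full URLs containing slashes.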

Shell script to make directories and subdirectories with variable names

I'm trying to create a script, to be run by cron, that creates multiple folders with subfolders.
DATE=`date +%Y-%m-%d`
IP_ADDR=`ifconfig | grep -v '127.0.0.1' | sed -n 's/.*inet addr:\([0-9.]\+\)\s.*/\1/p'`
/bin/mkdir -p /mnt/db-backup/12/$DATE/$IP_ADDR/
If I run this script manually, everything is created as expected. When the script is run by cron, the $IP_ADDR subdirectory is not created and there are no errors.
I suspect that /sbin is not part of the PATH for the environment that the cron job runs under. You should specify the full path for the ifconfig command:
IP_ADDR=$(/sbin/ifconfig | grep -v '127.0.0.1' | sed -n 's/.*inet addr:\([0-9.]\+\)\s.*/\1/p')
It's also better practice (in general) to use $() for command substitution.
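One concrete advantage of $(): it nests without special escaping, whereas nested back-ticks must be escaped. A throwaway example:
dir=$(basename $(pwd))     # nested $() needs no escaping
dir=`basename \`pwd\``     # back-tick version: inner back-ticks must be escaped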
Try debug mode:
set -x
DATE=`date +%Y-%m-%d`
IP_ADDR=`ifconfig | grep -v '127.0.0.1' | sed -n 's/.*inet addr:\([0-9.]\+\)\s.*/\1/p'`
/bin/mkdir -p /mnt/db-backup/12/$DATE/$IP_ADDR/
set +x
Then redirect the output of your cron job to a file and have a look; you should find useful information in it.
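For example, with a hypothetical crontab entry (the path and schedule are placeholders):
*/5 * * * * /path/to/backup-script.sh >> /tmp/backup-script.log 2>&1
The set -x trace goes to stderr, which 2>&1 also sends to the log file.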
You are not far off, but there are several caveats that could cause problems. Systems format the ifconfig output line differently: some use inet xxx.xxx.xxx.xxx, others inet addr:xxx.xxx.xxx.xxx (those are the two most common). You may also need to handle the case where there are multiple wired inet interfaces (2+ NICs in the box). However, if you have only 1 NIC, you could try the following to handle both common ifconfig formats:
DATE=`date +%Y-%m-%d`
IP_ADDR=$(ifconfig |
    grep -v '127.0.0.1' |
    grep -E 'inet[ ](addr:)*[0-9]{1,3}([.][0-9]{1,3}){3}' |
    sed -e 's/^.*inet \(addr:\)*//' -e 's/ .*$//')
/bin/mkdir -p /mnt/db-backup/12/$DATE/$IP_ADDR/
or with IP_ADDR written as one line:
IP_ADDR=$(ifconfig | grep -v '127.0.0.1' | grep -E 'inet[ ](addr:)*[0-9]{1,3}([.][0-9]{1,3}){3}' | sed -e 's/^.*inet \(addr:\)*//' -e 's/ .*$//')
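You can check that the pattern handles both formats (the sample addresses are made up):
printf '%s\n' 'inet addr:192.168.1.20  Bcast:192.168.1.255  Mask:255.255.255.0' |
    grep -E 'inet[ ](addr:)*[0-9]{1,3}([.][0-9]{1,3}){3}' |
    sed -e 's/^.*inet \(addr:\)*//' -e 's/ .*$//'
# prints: 192.168.1.20
printf '%s\n' 'inet 192.168.1.20  netmask 255.255.255.0  broadcast 192.168.1.255' |
    grep -E 'inet[ ](addr:)*[0-9]{1,3}([.][0-9]{1,3}){3}' |
    sed -e 's/^.*inet \(addr:\)*//' -e 's/ .*$//'
# prints: 192.168.1.20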

curl complex usage with pattern

I'm trying to get 2 files using curl based on some pattern but that doesn't seem to work:
Files:
SystemOut_15.04.01_21.12.36.log
SystemOut_15.04.01_15.54.05.log
curl -f -k -u "login:password" https://myserver/cgi-bin/logviewer/index.cgi?getlogfile=SystemOut_15.04.01_21.12.36.log'&'server=qwerty123.com'&'numlines=100000000'&'appenv=MBL%20-%20PROD'&'directory=/app/WAS/was85/profiles/node/logs/mbl-server1
I know there is the -A option, but it doesn't work since my file name is inside the link.
How can I extract those 2 files using a pattern?
I solved it myself. One curl gets the list of logs from the web page; another downloads those files.
The code looks like:
for file in $(curl -f -k -u "user:pwd" https://selfservice.pwj.com/cgi-bin/logviewer/index.cgi?listdirectory=/app/smx_client_mob/data/log'&'appenv=MBL%20-%20PROD'&'server=xshembl04pap.she.pwj.com |
    grep href | sed 's/.*href="//' | sed 's/".*//' |
    sed 's/javascript:getLog//g' | sed "s/['();]//g" |
    grep -i 'service' | grep '^[a-zA-Z].*'); do
    curl -o "$file" -f -k -u "user:pwd" https://selfservice.pwj.com/cgi-bin/logviewer/index.cgi?getlogfile="$file"'&'server=xshembl04pap.she.pwj.com'&'numlines=100000000'&'appenv=MBL%20-%20PROD'&'directory=/app/smx_client_mob/data/log
done
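A note on the '&' quoting in these commands: an unquoted & would make the shell background the command at that point, so each & in the query string is quoted individually. Quoting the whole URL once is equivalent and easier to read (example.com and foo.log are stand-ins):
curl 'https://example.com/index.cgi?getlogfile=foo.log&numlines=100'
# same as: curl https://example.com/index.cgi?getlogfile=foo.log'&'numlines=100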
