Piping commands in a shell script

I want to write a script that opens a shell with a few tabs, and I want each tab to execute something automatically. For some reason, when I pipe the commands it does not work.
gnome-terminal \
--tab-with-profile=Titleable -t "A" \
--tab-with-profile=Titleable -t "B" -e "sudo tail -f /var/log/syslog" \
--tab-with-profile=Titleable -t "C" -e "sudo tail -f /var/log/syslog | grep txt"
For some reason, tabs A and B work, but in tab C the grep txt part is ignored.
Anyone know why?
Thanks
Mat

The -e option hands its argument to the program directly rather than to a shell, so the pipe is never interpreted. Use a shell to call your command:
gnome-terminal \
--tab-with-profile=Titleable -t "A" \
--tab-with-profile=Titleable -t "B" -e "sudo tail -f /var/log/syslog" \
--tab-with-profile=Titleable -t "C" -e 'sh -c "sudo tail -f /var/log/syslog | grep txt"'

Related

how to terminate a process via key stroke

I have this function in my bash script:
sudo tshark -i eth0 -T fields -e ip.src -e dns.qry.name -Y "dns.qry.name~." -q 1>>log.txt 2>/dev/null &
while true
do
cat log.txt
done
It captures IPs and domain names in live mode and saves them into a log file.
How can I configure this live mode so that it can be terminated by pressing a key?
Use tee to watch the log while the command runs in the background, then read a keypress to terminate the script:
tshark -i eth0 -T fields -e ip.src -e dns.qry.name -Y "ip" -q 2>/dev/null | tee log.txt &
read -n1 c && echo "Got key $c"
exit
Note: running the command in a console will terminate it :-p
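If the capture should also be stopped explicitly rather than left running in the background, a variant could record the PIDs and kill them on the keypress. This is only a sketch of that idea, writing the log directly instead of through tee so both PIDs are easy to grab (tshark still needs capture privileges):
#!/bin/bash
# Capture in the background, appending to the log file.
tshark -i eth0 -T fields -e ip.src -e dns.qry.name -Y "ip" -q >> log.txt 2>/dev/null &
capture_pid=$!
# Watch the log live in the background.
tail -f log.txt &
viewer_pid=$!
# Any single keypress stops both background processes.
read -n1 -s
kill "$capture_pid" "$viewer_pid" 2>/dev/null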

Bash: Parse Urls from file, process them and then remove them from the file

I am trying to automate a procedure where the system will fetch the contents of a file (one URL per line), use wget to grab the files from the site (an https folder) and then remove the line from the file.
I have made several attempts, but the sed part (at the end) does not understand the string (I tried escaping characters) and does not remove it from the file!
cat File
https://something.net/xxx/data/Folder1/
https://something.net/xxx/data/Folder2/
https://something.net/xxx/data/Folder3/
My line of code is:
cat File | xargs -n1 -I # bash -c 'wget -r -nd -l 1 -c -A rar,zip,7z,txt,jpg,iso,sfv,md5,pdf --no-parent --restrict-file-names=nocontrol --user=test --password=pass --no-check-certificate "#" -P /mnt/USB/ && sed -e 's|#||g' File'
It works up until the sed -e 's|#||g' File part...
Thanks in advance!
Don't use cat if possible; it's bad practice and can be a problem with big files... You can change
cat File | xargs -n1 -I # bash -c
to
for siteUrl in $( < "File" ); do
It would be more correct, and simpler, to use sed with double quotes... My variant:
scriptDir=$( dirname -- "$0" )
for siteUrl in $( < "$scriptDir/File.txt" )
do
if [[ -z "$siteUrl" ]]; then break; fi # stop if the line is empty
wget -r -nd -l 1 -c -A rar,zip,7z,txt,jpg,iso,sfv,md5,pdf --no-parent --restrict-file-names=nocontrol --user=test --password=pass --no-check-certificate "$siteUrl" -P /mnt/USB/ && sed -i "s|$siteUrl||g" "$scriptDir/File.txt"
done
@beliy's answer looks good!
If you want a one-liner, you can do:
while read -r line; do \
wget -r -nd -l 1 -c -A rar,zip,7z,txt,jpg,iso,sfv,md5,pdf \
--no-parent --restrict-file-names=nocontrol --user=test \
--password=pass --no-check-certificate "$line" -P /mnt/USB/ \
&& sed -i -e '\|'"$line"'|d' "File.txt"; \
done < File.txt
EDIT:
You need to add a \ in front of the first pipe
I believe you just need to use double quotes after sed -e. Instead of:
'...&& sed -e 's|#||g' File'
you would need
'...&& sed -e '"'s|#||g'"' File'
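Putting that quoting fix together with the original command, the whole line could look like this (just a sketch: sed is given -i here so it actually edits File in place, and note that it blanks the matched URL rather than deleting the line):
# "#" is the xargs placeholder; the double-quoted segment keeps the inner single quotes intact.
cat File | xargs -n1 -I # bash -c 'wget -r -nd -l 1 -c -A rar,zip,7z,txt,jpg,iso,sfv,md5,pdf --no-parent --restrict-file-names=nocontrol --user=test --password=pass --no-check-certificate "#" -P /mnt/USB/ && sed -i '"'s|#||g'"' File'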
I see what you are trying to do, but I don't understand the sed command that includes pipes. Maybe it is some fancy format that I don't understand.
Anyway, I think the sed command should look like this...
sed -e 's/#//g'
This command will remove all # from the stream.
I hope this helps!

Using bash and ssh how do I write a log locally from a remote host

I am trying to get data from a file on a remote host and write it to a log file locally using SSH. The log file tmp_results.log is not being created. Any ideas where I'm going wrong, please?
( ssh -nq -o StrictHostKeyChecking=no \
-i $PEM_PATH/$PEM_FILE $USER@${host} -p $REMOTE_PORT \
tail -n 6 $REMOTE_HOME/data/result.jtl | >> $SCRIPT_DIR/$project/tmp_results.log)
You seem a little bit confused about pipes and redirections of file descriptors.
This writes to your log file:
ssh -nq -o StrictHostKeyChecking=no \
-i $PEM_PATH/$PEM_FILE $USER@${host} -p $REMOTE_PORT \
tail -n 6 $REMOTE_HOME/data/result.jtl > $SCRIPT_DIR/$project/tmp_results.log
If you want to append the output to an existing file, just use:
ssh -nq -o StrictHostKeyChecking=no \
-i $PEM_PATH/$PEM_FILE $USER@${host} -p $REMOTE_PORT \
tail -n 6 $REMOTE_HOME/data/result.jtl >> $SCRIPT_DIR/$project/tmp_results.log
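As an aside, the redirection in both commands above is unquoted, so it is performed by the local shell, which is what the question asks for. If the log were instead supposed to end up on the remote host, the whole remote command would need to be quoted so the redirection happens there (a sketch using the same variables; /tmp/remote_results.log is just a hypothetical remote path):
# The double quotes make >> part of the remote command, so the file is written on ${host}.
ssh -nq -o StrictHostKeyChecking=no \
    -i $PEM_PATH/$PEM_FILE $USER@${host} -p $REMOTE_PORT \
    "tail -n 6 $REMOTE_HOME/data/result.jtl >> /tmp/remote_results.log"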

Bash piping issue

I need to execute the following grep query as an argument for konsole (the KDE terminal).
grep -R -i -n -A 2 -B 2 --color=always -R "searchtext" * | less -R
works in the current terminal.
konsole --workdir `pwd` -e grep -R -i -n -A 2 -B 2 --color=always -R "searchtext" * | less -R
works, but the konsole window runs the grep query without the less pipe.
Ideally I want konsole to spawn as a separate process with konsole & and send the grep command with less as an argument to konsole -e.
You need to run the pipe in a shell.
konsole --workdir `pwd` -e bash -c 'grep -R -i -n -A 2 -B 2 --color=always -R "searchtext" * | less -R'
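To cover the "spawn as a separate process" part, the same invocation can simply be backgrounded (a sketch):
# Launch konsole detached from the current shell; the pipe still runs inside bash -c.
konsole --workdir "$(pwd)" -e bash -c 'grep -R -i -n -A 2 -B 2 --color=always "searchtext" * | less -R' &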

Wget page title

Is it possible to Wget a page's title from the command line?
input:
$ wget http://bit.ly/rQyhG5
output:
If it’s broke, fix it right - Keeping it Real Estate. Home
This script would give you what you need:
wget --quiet -O - http://bit.ly/rQyhG5 \
| sed -n -e 's!.*<title>\(.*\)</title>.*!\1!p'
But there are lots of situations where it breaks, including if there is a <title>...</title> in the body of the page, or if the title is on more than one line.
This might be a little better:
wget --quiet -O - http://bit.ly/rQyhG5 \
| paste -s -d " " \
| sed -e 's!.*<head>\(.*\)</head>.*!\1!' \
| sed -e 's!.*<title>\(.*\)</title>.*!\1!'
but it does not fit your case as your page contains the following head opening:
<head profile="http://gmpg.org/xfn/11">
Again, this might be better:
wget --quiet -O - http://bit.ly/rQyhG5 \
| paste -s -d " " \
| sed -e 's!.*<head[^>]*>\(.*\)</head>.*!\1!' \
| sed -e 's!.*<title>\(.*\)</title>.*!\1!'
but there are still ways to break it, including no head/title in the page.
Again, a better solution might be:
wget --quiet -O - http://bit.ly/rQyhG5 \
| paste -s -d " " \
| sed -n -e 's!.*<head[^>]*>\(.*\)</head>.*!\1!p' \
| sed -n -e 's!.*<title>\(.*\)</title>.*!\1!p'
but I am sure we can find a way to break it. This is why a true XML parser is the right solution, but as your question is tagged shell, the above is the best I can come up with.
The paste and the two sed commands can be merged into a single sed, but it is less readable. However, this version has the advantage of working on multi-line titles:
wget --quiet -O - http://bit.ly/rQyhG5 \
| sed -n -e 'H;${x;s!.*<head[^>]*>\(.*\)</head>.*!\1!;T;s!.*<title>\(.*\)</title>.*!\1!p}'
Update:
As explained in the comments, the last sed above uses the T command, which is a GNU extension. If you do not have a compatible version, you can use:
wget --quiet -O - http://bit.ly/rQyhG5 \
| sed -n -e 'H;${x;s!.*<head[^>]*>\(.*\)</head>.*!\1!;tnext;b;:next;s!.*<title>\(.*\)</title>.*!\1!p}'
Update 2:
As the above still does not work on Mac, try:
wget --quiet -O - http://bit.ly/rQyhG5 \
| sed -n -e 'H;${x;s!.*<head[^>]*>\(.*\)</head>.*!\1!;tnext};b;:next;s!.*<title>\(.*\)</title>.*!\1!p'
and/or
cat << EOF > script
H
\$x
\$s!.*<head[^>]*>\(.*\)</head>.*!\1!
\$tnext
b
:next
s!.*<title>\(.*\)</title>.*!\1!p
EOF
wget --quiet -O - http://bit.ly/rQyhG5 \
| sed -n -f script
(Note the \ before the $ to avoid variable expansion.)
It seems that the :next label does not like to be prefixed by a $, which could be a problem in some sed versions.
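For completeness, if a real HTML parser is available the sed gymnastics can be skipped entirely; for example, libxml2's xmllint can pull the title out (a sketch, assuming xmllint is installed):
# Parse the page as HTML and print only the text of the <title> element.
wget --quiet -O - http://bit.ly/rQyhG5 \
  | xmllint --html --xpath '//title/text()' - 2>/dev/null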
The following will pull whatever lynx thinks the title of the page is, saving you from all of the regex nonsense. Assuming the page you are retrieving is standards-compliant enough for lynx, this should not break.
lynx -dump example.com | sed '2q;d'
