taking input in while loop overrides the same line in bash script - bash

When I try to read command-line input using a while loop in bash, it overwrites the prompt. How can I prevent this? I can't remove the -r and -e flags from the read command; if I do, I won't be able to use the arrow keys.
sample of code is here
while :
do
    # easy life things
    name=$(whoami)
    prompt=$'\033[1;33m;-)\033[0m\033[1;31m'${name}$'\033[0m\033[1;34m#'$(hostname)$'\033[0m\033[1;32m>>\033[0m\033[1;31m🕸️ \033[0m'
    echo -n -e "${blue}"
    read -r -e -p "${prompt} " cmd
    history -s "$cmd"
    echo -n -e "${nc}"
    # the code that got erased here doesn't have any problem; the problem is with the read command
done
I was expecting it not to overwrite the same line, without my having to remove those two flags from the read command.

The script snippet you gave does not represent the environment or code that generated the conditions shown in the image you provided.
What you show in the image is a condition some people encounter when the PS1 prompt string is incorrectly defined, containing malformed color-coding sequences.
You are facing the same conditions encountered in the situation presented in this other Question, where I gave a detailed explanation.
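For reference, a minimal sketch of the usual fix (an assumption here, since the image isn't shown): with read -e, readline handles the prompt, and any non-printing escape sequences in the -p string must be wrapped in \001 and \002 (the readline equivalents of \[ and \] in PS1) so the visible prompt width is computed correctly and the line isn't redrawn over:

```shell
#!/bin/bash
# Sketch: mark color escapes as zero-width for readline by wrapping them
# in \001 ... \002 (the same role \[ ... \] plays inside PS1).
name=$(whoami)
host=$(hostname)
prompt=$'\001\033[1;31m\002'"${name}"$'\001\033[0m\033[1;34m\002'"#${host}"$'\001\033[0m\033[1;32m\002'">>"$'\001\033[0m\002'" "
if [ -t 0 ]; then               # only prompt when stdin is a terminal
    read -r -e -p "${prompt}" cmd
    history -s "$cmd"
fi
printf 'prompt length: %s chars\n' "${#prompt}"
```

Without the \001/\002 markers the prompt still displays, but readline miscounts the column position, and editing or history recall overwrites the line.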


Read variable line by line in shell (dash included)

I have a variable with stored logs. What I want is to read every line of the variable containing logs and keep or remove each line based on some filtering.
The problem is that my code works in bash, but not in dash.
This is my code:
filtered_logs=""
while IFS= read -r line
do
...(store line in $filtered_logs if it passes the filter)
done <<< "$logs"
logs="$filtered_logs"
This code works in bash, but ' done <<< "$logs" ' does not work in dash (which is the default sh on Ubuntu). It's homework and I need it to work on every shell possible.
What I tried was:
filtered_logs=""
echo "$logs" |
while IFS= read -r line
do
...(store line in $filtered_logs if it passes the filter)
done
logs="$filtered_logs"
But if I store something in $filtered_logs from inside the while loop, it doesn't persist. And I can't step into the while loop with my debugger either. (I think the whole while loop is a new process, since I run it through a pipe |.)
My question is how to make it work, please. Thank you.
It's easy to do this with a clean POSIX approach in your case:
logs=$(
printf "%s" "$logs" |
while IFS= read -r line
do
echo "store this to logs"
done
)
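To make the idea concrete, here is a self-contained sketch (the keep* pattern is just a hypothetical filter): the while loop still runs in a subshell, but that no longer matters, because everything the pipeline prints is collected by the command substitution and assigned back to the variable:

```shell
#!/bin/sh
# POSIX sh: filter $logs line by line, capturing the pipeline's stdout.
logs='keep one
drop this
keep two'
logs=$(
    printf '%s\n' "$logs" |
    while IFS= read -r line; do
        case $line in
            keep*) printf '%s\n' "$line" ;;   # hypothetical filter condition
        esac
    done
)
printf '%s\n' "$logs"
```

With that input, $logs ends up holding only the two lines that start with "keep".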

Is it possible to add an empty line before and after the output of a command, but only if the command actually produced output?

Just to give a quick example:
~$ cd ~/Documents/
~$ ls
a file another file
~$ echo "Ok"
Ok
~$ rm *
~$ ls
~$ cd
This is inspired by this question, where trap 'echo' DEBUG is used to add an empty line before and after the output of every command. Unfortunately, empty lines are still added if the command had no output. Another downside of trap 'echo' DEBUG is that it is a bit 'hacky' and I've heard of it breaking some programs. It also adds empty lines between the outputs of commands in a pipe.
Now, I realise that it may not be possible to add an empty line before the output of a command because, to check whether the command has any output, the output has to first be printed to the screen, and by then you can't add the empty line! Because of this it would also be OK if there were an empty line between prompts when running a command without output, just not two lines, as with trap 'echo' DEBUG.
The reason I'm doing this is that I often find it difficult to find my prompt in a sea of output, especially when it is a colourful diff. I have tried two-line prompts, sometimes with empty lines before them, and also adding a hideous mess of colours to my prompt. Neither has been quite satisfactory.
Thanks for the help!
You can simply do:
echo ""
Although for formatted output ending with a newline, printf is a better choice, for example:
printf "%s\n\n" "output"
I think the following thread is on topic for the empty lines:
What is the preferred method to echo a blank line in a shell script?
As for controlling the output, try capturing the command's output in a variable and then evaluating it the way you want.
For example with ls and if:
list=$(ls | wc -l)
if [ "$list" -gt 0 ]
then
echo "Working fine"
else
echo ""
fi
You can still print the output of the command if you need to, but I think it is not necessary unless some kind of reporting is involved.
For example, to print the output in both cases, add the following both when the condition is fulfilled and when it is not (it executes the command again):
echo "$(ls)"
For example, altering the condition so it is not fulfilled gives the desired output:
adama@galactica:~$ ./processing.sh
a
column2.txt
pepito
processing.sh
test.txt
Best Regards
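Building on the capture idea, one way to pad only commands that produce output is a small wrapper function. This is just a sketch, and it has a caveat: because the output is captured first, the command no longer sees a terminal, so tools like diff may drop their colors:

```shell
#!/bin/sh
# pad: run a command and surround its output with blank lines,
# but only when there actually is output.
pad() {
    out=$("$@" 2>&1)
    if [ -n "$out" ]; then
        printf '\n%s\n\n' "$out"
    fi
}
pad echo "hello"   # prints: blank line, hello, blank line
pad true           # no output, so nothing at all is printed
```

Usage is simply pad <command> <args...>; a silent command produces no padding, so prompts stay adjacent.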

Command only getting first line of multiline here string

I'm trying to pass a here string to a command that expects three values to be passed interactively. It seems like it should be simple enough, but for some reason, the program seems to only be receiving the first line of the here string properly and ignoring everything after the first \n.
Here is what I'm trying:
command <<< $'firstValue\nsecondValue\nthirdValue\n'
If anyone could tell me what I'm missing, I'd appreciate it greatly. I'm not sure if it's relevant or not, but the second value contains a space. I'm running this on a Mac.
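For what it's worth, a quick sanity check with cat -e (which marks each line end with a $, and works on both GNU and BSD/macOS cat) suggests the here string itself delivers every line, so the loss would be on the consuming side:

```shell
#!/bin/bash
# All three lines, newlines included, reach the consumer of the here string.
cat -e <<< $'firstValue\nsecond Value\nthirdValue'
```

This prints each value on its own line with a trailing $, confirming three complete lines arrive on the command's stdin.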
I would maybe recommend setting up a while read loop for your here-string values:
#!/bin/bash
read -r -d '' vals <<EOT
first value
second value
third value
EOT
command <<< "$vals"
If you wanted to run the command each time on each argument:
while read -r src; do command "$src" ; done<<<"$vals"
Since you need the values run one at a time, it might be easier to manage, and then you won't need to worry about the newline (\n) issues.
It turns out that the command I was passing the here string to couldn't consume the input as fast as the here string supplied it. I ended up using the following workaround:
(printf 'value1\n'; sleep 2; printf 'value2\n'; sleep 2; printf 'value3\n') | command
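A slightly more general form of that workaround (hypothetical values, with cat standing in for the real command) paces each line with a delay so a slow interactive reader keeps up:

```shell
#!/bin/bash
# Emit each value on its own line, pausing between lines, and pipe the
# paced stream to the consumer (cat is a stand-in for the real command).
for v in 'value1' 'second value' 'value3'; do
    printf '%s\n' "$v"
    sleep 0.2    # tune to however long the program needs per prompt
done | cat
```

Values with spaces survive intact because each one is quoted and printed as a whole line.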

Bash scripted curl commands producing different results than manual runs

I have a text file of roughly 900 cURLs to run. They are pretty hairy, with tons of quotes, apostrophes and other special characters.
To run them I have been trying to create a bash script to loop through the list:
#!/bin/sh
OLDIFS=$IFS
IFS="&&&"
echo "getting started"
cat staging_curl_script|while read line
do
$line
done
echo "done"
Unfortunately I have had an unusual issue: commands that run fine at the command prompt return a "file name too long" error from the script. I echoed these commands from the script and compared them to the manually run commands, and they are identical.
Any idea why I am seeing different results?
Silly mistake here: I needed bash -c "$line".
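A sketch of the corrected loop, with a throwaway demo file standing in for the real staging_curl_script: handing each stored line to bash -c makes a fresh shell re-parse its quotes, whereas the original unquoted $line expansion word-splits the text and treats the quote characters literally:

```shell
#!/bin/bash
# Demo input: one complete command per line, quotes and all.
printf '%s\n' "echo 'first command'" 'echo "second command"' > /tmp/curl_demo.txt

while IFS= read -r line; do
    [ -n "$line" ] && bash -c "$line"    # re-parse the line as a new command
done < /tmp/curl_demo.txt
```

Each line runs exactly as it would if pasted at a prompt, which is what the 900 stored cURL commands need.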

gnome terminal tabs open multiple ssh connections

I have a file with a list of servers:
SERVERS.TXT:
192.168.0.100
192.168.0.101
192.168.0.102
From a gnome-terminal script, I want to open a new terminal with a tab for each server.
Here is what I tried:
gnome-terminal --profile=TabProfile `while read SERVER ; do echo "--tab -e 'ssh usr@$SERVER'"; done < SERVERS.TXT`
Here is the error:
Failed to parse arguments: Argument to "--command/-e" is not a valid command: Text ended before matching quote was found for '. (The text was ''ssh')
I tried removing the space after the -e:
gnome-terminal --profile=TabProfile `while read SERVER ; do echo "--tab -e'ssh usr@$SERVER'"; done < SERVERS.TXT`
And I get a similar error:
Failed to parse arguments: Argument to "--command/-e" is not a valid command: Text ended before matching quote was found for '. (The text was 'usr@192.168.0.100'')
Obviously there is a parsing error, since the shell is trying to be helpful by using the spaces to place delimiters. The server file changes without notice, and many different sets of servers need to be looked at.
I found this question while searching for an answer to the issue the OP had, but my issue was a little different: I knew the list of servers, and they were not in a file.
Anyway, the other solutions posted did not work for me, but the following script does work, and is what I use to get around the 'Argument to "--command/-e" is not a valid command' error.
The script should be very easy change to suit any need:
#!/bin/sh
# Open a terminal to each of the servers
#
# The list of servers
LIST="server1.info server2.info server3.info server4.info"
cmdssh=`which ssh`
for s in $LIST
do
title=`echo -n "${s}" | sed 's/^\(.\)/\U\1/'`
args="${args} --tab --title=\"$title\" --command=\"${cmdssh} ${s}.com\""
done
tmpfile=`mktemp`
echo "gnome-terminal${args}" > $tmpfile
chmod 744 $tmpfile
. $tmpfile
rm $tmpfile
Now the big question is why this works when run from a file but not from within a script. Sure, the issue is about the escaping of the --command part, but everything I tried failed unless it was exported to a temp file.
I would try something like:
$ while read SERVER;do echo -n "--tab -e 'ssh usr@$SERVER' "; \
done < SERVERS.txt | xargs gnome-terminal --profile=TabProfile
This is to avoid any interpretation that the shell could do of the parameters (anything starting with a dash).
Because it concatenates the strings (using -n), it is necessary to add a space between them.
Is this a problem of parsing command-line options? Sometimes if you have one command sending arguments to another command, the first can get confused. The convention is to use a -- like so:
echo -- "--tab -e 'ssh usr@$SERVER'";
Try putting eval before the gnome-terminal command. It should look something like this:
eval /usr/bin/gnome-terminal $xargs
Worked for me!
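Another option, sketched here with a demo server list and the final call printed instead of executed: build the options in a bash array, which keeps each --tab/-e pair as separate, correctly quoted words without eval or a temp file (this assumes the older gnome-terminal interface that still accepts repeated --tab -e options):

```shell
#!/bin/bash
# Stand-in for SERVERS.TXT
printf '%s\n' 192.168.0.100 192.168.0.101 > /tmp/servers_demo.txt

args=()
while IFS= read -r server; do
    args+=(--tab -e "ssh usr@$server")   # each word stays its own argument
done < /tmp/servers_demo.txt

# The real invocation would be: gnome-terminal --profile=TabProfile "${args[@]}"
printf '[%s]\n' "${args[@]}"             # show the argument boundaries
```

The bracketed output makes it visible that 'ssh usr@192.168.0.100' is passed as one argument, which is exactly what the backtick versions fail to achieve.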
