Setting environment variables from two lines of output - bash

I have an external program (a kind of authentication token generator) that outputs two lines, the two parts of my authentication.
$ get-auth
SomeAuthString1
SomeAuthString2
Then I want to export these strings into environment variables, say AUTH and PIN.
I tried several things in bash, but nothing works.
For example, I can do:
get-auth | read -d$'\4' AUTH PIN
but it fails. AUTH and PIN remain unset. If I do
get-auth | paste -d\ -s | read AUTH PIN
it also fails. The only way I can get the data is by doing
get-auth | { read AUTH; read PIN; }
but obviously the values exist only in the subshell. Exporting from there has no effect.
A bit of research turned up this answer, which might mean that I can't do that (read a variable from something piped into read). But I might be wrong. I also found that if I open a subshell with { before the read, the values are available in the subshell until I close it with }.
Is there any way I can set environment variables from the two-line output? I obviously don't want to save that to a file, and I don't want to set up a FIFO just for that. Are those the only ways of getting that done?

You can use bash process substitution, <(), together with the read command to achieve this.
$ cat file
123
456
With process substitution the read runs in the current shell, so the values it reads are retained. Using read with a delimiter so that both lines are consumed in one call:
$ read -r -d'\n' a b < <(cat file)
$ printf "%s %s\n" "$a" "$b"
123 456
Now that the variables are available in the current shell, you can always export them.
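For example, a minimal sketch continuing from the file above, with bash -c standing in for any child process that should see the values:
$ read -r -d'\n' a b < <(cat file)
$ export a b
$ bash -c 'echo "$a $b"'
123 456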

The subshell was the issue. I was able to make it work by doing:
read -d $'\4' AUTH PIN < <(get-auth)
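With the read now running in the current shell, a plain export afterwards puts both values into the environment for child processes (a sketch, assuming get-auth behaves as shown above):
export AUTH PIN
env | grep -E '^(AUTH|PIN)='    # AUTH=SomeAuthString1, PIN=SomeAuthString2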

Related

How to set variables whose names and values are taken from a command's output with `declare` in bash?

I need to declare variables in a bash shell script for which both names and values are taken from another command's output.
For the sake of this question, I will use a temporary file tmp:
$ cat tmp
var1="hello world"
var2="1"
... and use it for my mock command below.
In the end, I need to have the variables $var1 and $var2 set to hello world and 1 respectively, with the variable names var1 and var2 taken directly from the input.
Here is what I got so far:
$ cat tmp|while read line; do declare $line; done
I know I don't need to use cat, but this is to simulate the fact that the input comes from the output of another command, not from a file.
This doesn't work. I get:
bash: declare: `world"': not a valid identifier
and
$ echo $var1; echo $var2
$
I don't understand why this doesn't work since I can do this:
declare var1="hello world"
... with the expected result. I assumed the two would be equivalent, but I'm clearly wrong.
I found this answer as the closest thing to my problem, but it's not quite it, since it relies on sourcing a file. I would like to avoid that. I found other answers that use eval, but I'd prefer to avoid that as well.
Maybe there are subtleties in the use of quotes I don't understand.
If the only way is to use a temporary file and source it, that is what I'll do, but I think there must be another way.
A good rule when writing a shell script is to always double-quote variable expansions; otherwise they are subject to shell word splitting.
while read -r line; do
    declare "$line"
done < <(echo "var1=hello world")
And why doesn't echo "var1=hello world" | while read line; do export "$line"; done work? Because each side of a pipe runs in a subshell: var1 is created in the subshell and never affects the current shell, so it can't be set there.
As an alternative, use process substitution: the loop then runs in the current shell and reads the command's output as if from a temporary file, so the variable is created in the current shell.
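A minimal sketch of that approach, using printf as a stand-in for the real command (note that the input here carries no literal quote characters; with a line like var1="hello world" from the tmp file above, declare would keep the quotes as part of the value):
while read -r line; do
    declare "$line"
done < <(printf 'var1=hello world\nvar2=1\n')
echo "$var1"    # hello world
echo "$var2"    # 1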

Bash: export variables received line by line

I have a simple but annoying problem.
The goal is to write a shell script that runs on an EC2 instance and exports the instance's tags as variables for the rest of the script. Something like:
ec2-describe-tags [...] |
    while IFS=':' read name value; do
        export "$name"="$value"
    done
Not too ugly, but it doesn't work, of course, because the export happens inside the while loop, which is executed in a pipe and therefore in a subshell.
My question is: how do I write this correctly? Of course I cannot predict the names or the number of the received tags.
Try this:
while IFS=: read -r name value; do
    export "$name=$value"
done < <(ec2-describe-tags ...)
A pipeline runs the commands in a subshell, so the variables don't persist when it's done.
Since it seems that the output consists solely of lines of the form name:value, you should just be able to do
while read -r; do
    export "$REPLY"    # Using default variable set by read
done < <(ec2-describe-tags ... | sed 's/:/=/')
You could even get fancy with the readarray command (if available) and simply run
readarray -t env_vars < <(ec2-describe-tags ... | sed 's/:/=/')
export "${env_vars[@]}"
The process substitution allows the while loop to run in the current shell, so the exported variables will be put in the shell's environment.
If you are using bash 4.2 or later, you can set the lastpipe option (shopt -s lastpipe), which allows the last command in a pipeline to run in the current shell instead of a subshell, letting you keep your current pipeline. Note that it only takes effect when job control is off, which is the normal state for a non-interactive script.
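A sketch of the lastpipe variant, with printf standing in for the real ec2-describe-tags output (the tag names and values here are made up for illustration):
#!/bin/bash
shopt -s lastpipe
printf 'Name:server1\nEnv:prod\n' |
while IFS=: read -r name value; do
    export "$name=$value"
done
echo "$Name $Env"    # server1 prod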

name of current variable in bash pipe

In PowerShell, $_ refers to the current object being passed through the pipe. What is the equivalent of this in Bash?
Let's say I want to do this
echo "Hi" | echo "$_"
prints Hi
Thanks
Bash (or any other Unix shell for that matter) has no such thing.
In PowerShell, what is passed through pipes is an object. In bash, this is the output (to stdout) of the command.
The closest thing you can do is use while:
thecmd | while read theline; do something_with "$theline"; done
Note that IFS (the Internal Field Separator) is used for word splitting, so you can also do:
thecmd | while read first therest; do ...; done
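For example, with the default IFS the first word goes into the first variable and the remainder of the line into the last one:
$ printf 'alice has two cats\n' | while read first therest; do echo "first=$first rest=$therest"; done
first=alice rest=has two cats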
In the Unix ideology there are no objects or variables. The main selling point of Unix is that everything is plain text, so you just pass text from one command to another.
You can think of it as having a single implicit variable that you always use. Your example is a bit odd, but the closest thing I can think of is the cat command, which takes whatever its input is (think of it as that one implicit variable) and outputs it, so
echo "Hi" | cat
prints
Hi
If you really need to access this value as a variable (for whatever reason), you can "fake" it by using a command substitution, $(cat), in place of a variable, like this:
echo "Hi" | echo "$(cat)"

How to execute lines of text on the clipboard as bash commands

I'm working with Mac OS X's pbpaste command, which returns the clipboard's contents. I'd like to create a shell script that executes each line returned by pbpaste as a separate bash command. For example, let's say that the clipboard's contents consists of the following lines of text:
echo 1234 >~/a.txt
echo 5678 >~/b.txt
I would like a shell script that executes each of those lines, creating the two files a.txt and b.txt in my home folder. After a fair amount of searching and trial and error, I've gotten to the point where I'm able to assign individual lines of text to a variable in a while loop with the following construct:
pbpaste | egrep -o [^$]+ | while read l; do echo $l; done
which sends the following to standard out, as expected:
echo 1234 >~/a.txt
echo 5678 >~/b.txt
Instead of simply echoing each line of text, I then try to execute them with the following construct:
pbpaste | egrep -o [^$]+ | while read l; do $l; done
I thought that this would execute each line (thus creating two text files a.txt and b.txt in my home folder). Instead, the first term (echo) seems to be interpreted as the command, and the remaining terms (nnnn >~/...) seem to get lumped together as if they were a single parameter, resulting in the following being sent to standard out without any files being created:
1234 >~/a.txt
5678 >~/b.txt
I would be grateful for any help in understanding why my construct isn't working and what changes might get it to work.
[…] the remaining terms (nnnn >~/...) seem to get lumped together as if they were a single parameter, […]
Not exactly. The line actually gets split on whitespace (or whatever $IFS specifies), but the problem is that the redirection operator > cannot be taken from a shell variable. For example, this snippet:
gt='>'
echo $gt foo.txt
will print > foo.txt, rather than printing a newline to foo.txt.
And you'll have similar problems with various other shell metacharacters, such as quotation marks.
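For instance, quotation marks stored in a variable are treated as literal characters rather than as shell quoting:
$ cmd='echo "hello world"'
$ $cmd
"hello world"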
What you need is the eval builtin, which takes a string, parses it as a shell command, and runs it:
pbpaste | egrep -o [^$]+ | while IFS= read -r LINE; do eval "$LINE"; done
(The IFS= and -r and the double-quotes around $LINE are all to prevent any other processing besides the processing performed by eval, so that e.g. whitespace inside quotation marks will be preserved.)
Another possibility, depending on the details of what you need, is simply to pipe the commands into a new instance of Bash:
pbpaste | egrep -o [^$]+ | bash
Edited to add: For that matter, it occurs to me that you can pass everything to eval in a single batch; just as you can (per your comment) write pbpaste | bash, you can also write eval "$(pbpaste)". That will support multiline while-loops and so on, while still running in the current shell (useful if you want it to be able to reference shell parameters, to set environment variables, etc., etc.).
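As a sketch, assuming the clipboard holds exactly the two echo lines from the question:
$ eval "$(pbpaste)"
$ cat ~/a.txt ~/b.txt
1234
5678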

Shell script pipeline: different behavior with or without xclip

I have a script named password-for-object which I normally run like that:
$ password-for-object example.com
sOtzC0UY1K3EDYp8a6ltfA
I.e. it does an intricate hash calculation and outputs a password that I should use when accessing an object (for example, a website) named example.com. I just double-click the whole password, it gets copied into my selection buffer, and I paste it into the form.
I've also learnt a trick on how to use such a script without making my password visible:
$ password-for-object example.com | xclip
This way the script's output ends up in X's primary selection, I can paste it right into the password field in the form, and it is never shown on the screen.
The only problem with this approach is that password-for-object outputs the string with a trailing newline, so xclip always picks up one extra character: that newline. If I omit the newline in password-for-object, I end up with messed-up output when I'm not using xclip, i.e. when the password just goes to stdout. I use two shells, zsh and bash. In zsh I get the following (note the extra % sign):
$ password-for-object example.com
sOtzC0UY1K3EDYp8a6ltfA%
$
Or the following in bash (note that the next prompt starts on the same line):
$ password-for-object example.com
sOtzC0UY1K3EDYp8a6ltfA$
Any ideas on how to work around this? Is it possible to modify the script so that it detects that xclip is on the receiving end of the pipe and only outputs the newline when it isn't?
If you change password-for-object so that it doesn't output a newline, you can call it with a script like:
#!/bin/bash
password-for-object "$1"
if [ -t 1 ]
then
    echo
fi
The -t condition is described in the bash manual as:
-t fd
True if file descriptor fd is open and refers to a terminal.
See the following question:
How to detect if my shell script is running through a pipe?
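With a wrapper like that, the behaviour would look like this (a sketch, assuming the wrapper is saved under a hypothetical name pw and that password-for-object itself no longer prints the trailing newline):
$ pw example.com           # stdout is a terminal: the wrapper adds a newline
sOtzC0UY1K3EDYp8a6ltfA
$ pw example.com | xclip   # stdout is a pipe: no newline reaches the clipboard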
Give this a try:
$ password-for-object example.com | tr -d '\n' | xclip
tr -d '\n' deletes the newline
