Multiple variable definitions in a bash sourced file behave differently - bash

I have a .env file with the following:
FST_TEST=1
SCD_TEST=2
I run source .env and then:
If I run echo "$FST_TEST$FST_TEST" it prints 1.
If I run echo "$SCD_TEST$SCD_TEST" it prints 22.
I would have expected echo "$FST_TEST$FST_TEST" to also print 11, but I can't manage to get that... I think it has something to do with the carriage return character.

Most likely, it's because your .env file is in Windows/DOS format.
Can you run:
dos2unix .env
With a DOS end of line, FST_TEST=1 is actually FST_TEST=1\r.
The \r makes the cursor go back to the beginning of the line, so the 1
you saw is really two 1s, one printed on top of the other.
On the second line you didn't put an end of line, so there was no \r and no problem.
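To check whether that's what is happening, here's a quick sketch (cat -A shows a carriage return as ^M and marks each end of line with $):
cat -A .env
If you see FST_TEST=1^M$, the file has DOS line endings. As an alternative to dos2unix, GNU sed can strip the \r in place (then source .env again):
sed -i 's/\r$//' .env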

Not sure what your .env file contains. Can you share its content?
Maybe try this:
echo ${FST_TEST}${FST_TEST}
Hope it helps.

Related

CMD/batch script: how to add a digit to a file without a trailing space

I want to make a start.config file using a bat script.
I want to add this parameter to my config file:
StartAgents=9
WITHOUT a space at the end of the line.
Unfortunately, the command:
echo StartAgents=9>>C:\start.config
is not working; I think there is a "collision" between the two characters "9>".
The command:
echo StartAgents=9 >>C:\start.config
works, but it adds a space at the end of the line in my config file, which I don't want.
Any ideas how to do that?
I want to add the line StartAgents=9 without a space at the end of the line.
want:
StartAgents=9
don't want (with a trailing space at the end):
StartAgents=9
You have to escape the number, so that it isn't interpreted by CMD.EXE as a file descriptor number.
Then you can put the >> redirection directly after the number without inserting a trailing space.
Example:
echo StartAgents=^9>>test.txt
Related Information:
Omitting trailing space
File Descriptor Usage

How to grab value back from external script in bash?

I'm sure I'm missing something stupid. I want to pass a full path variable to a perl script, where I do some work on it and then pass it back. So I have:
echo "Backing up: $f ";
$write_file="$(perl /home/spider/web/foo.com/public_html/gen-path.cgi $f)";
echo "WRITE TO: $write_file \n";
However, this gives me:
Backing up: /home/spider/web/foo.com/public_html/websites-uk/uk/q/u
backup-files-all.sh: line 7: =backup-uk-q-u.tar.gz: command not found
WRITE TO: \n
I can't work out why it's not saving the output into $write_file. I must be missing something (bash isn't my preferred language, which is why I'm passing to Perl, as I'm a lot more fluent in that :))
Unless your variable write_file already exists, the command $write_file="something" will translate to ="something"(1).
When setting a variable, leave off the $ - you only need it if you want the value of the variable.
In other words, what you need is (note no semicolons needed):
write_file="$(perl /home/spider/web/foo.com/public_html/gen-path.cgi $f)"
(1) It can be even hairier if write_file is already set to something. For example, the code:
write_file=xyzzy
$write_file="something"
expands to xyzzy="something", which bash then tries to run as a command, so instead of an assignment you get an even more confusing error: xyzzy=something: command not found :-)
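A minimal sketch you can paste into an interactive bash session to see the difference (the names are just for illustration):
write_file=xyzzy
$write_file="something"   # expands to xyzzy=something; bash tries to run it: command not found
write_file="something"    # correct: no $ on the left-hand side
echo "$write_file"        # prints: something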

Echo variable defined as result from command line resets cursor position

If I run this script:
#!/bin/bash
HOSTNAME=$(< ds.tmp)
echo "Hello${HOSTNAME}!"
TEST="1.2.3.4"
echo "Hello${TEST}!"
With the contents of ds.tmp only an ip address (say 1.2.3.4), the result is:
!ello1.2.3.4
Hello1.2.3.4!
So after I print a variable that is assigned by a $(...), the cursor position is reset and it overwrites all text.
Why is this? I have looked everywhere but cannot find a reference to this anywhere...
Your ds.tmp file has CR-LF as its line breaks. As a result, ${HOSTNAME} contains 1.2.3.4\r, not just 1.2.3.4.
Unix text files should just use LF as their line breaks. Use dos2unix to fix it.
Try this:
HOSTNAME=$(tr -d "\r" < ds.tmp)
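Along the same lines, a small sketch using bash parameter expansion instead of tr (assuming the trailing \r is the only stray character):
HOSTNAME=$(< ds.tmp)
HOSTNAME=${HOSTNAME%$'\r'}   # strip a trailing carriage return, if present
echo "Hello${HOSTNAME}!"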

Remove colour code special characters from bash file

I have a bash script that runs and outputs to a text file; however, the colour codes it uses are also included. What I'd like to know is how to remove them from the file, i.e.
^[[38;1;32mHello^[[39m
^[[38;1;31mUser^[[39m
so I just want to be left with Hello and User, so something like sed -r "special characters" from file A, saved to file B
sed 's/\^\[\[[^m]*m//g'
This removes every part of the line starting with ^[[ up to and including the first m.
Something like this:
awk '{sub(/\^\[\[38;1;[0-9][0-9]m/,x);sub(/\^\[\[39m/,x)}1'
Hello
User
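Note that ^[ is usually just how the single escape character (byte 0x1b) is displayed. If your file contains real escape characters rather than the literal characters ^ and [, a more general sketch using GNU sed that strips any colour (SGR) sequence would be (fileA and fileB stand for your input and output files):
sed 's/\x1b\[[0-9;]*m//g' fileA > fileB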

Bash curl and variable in the middle of the url

I need to read certain data using curl. I'm basically reading keywords from a file:
while read line
do
curl 'https://gdata.youtube.com/feeds/api/users/'"${line}"'/subscriptions?v=2&alt=json' \
> '/home/user/archive/'"$line"
done < textfile.txt
Anyway, I haven't found a way to form the URL for curl so that it would work. I've tried just about every possible single- and double-quoted version. I've tried basically:
'...'"$line"'...'
"..."${line}"..."
'...'$line'...'
and so on... Just name it and I'm pretty sure that I've tried it.
When I print out the URL, in the best case it is formed as:
/subscriptions?v=2&alt=jsoneeds/api/users/KEYWORD FROM FILE
or something similar. If you know what could be the cause of this I would appreciate the information. Thanks!
It's not a quoting issue. The problem is that your keyword file is in DOS format -- that is, each line ends with carriage return & linefeed (\r\n) rather than just linefeed (\n). The carriage return is getting read into the line variable, and included in the URL. The giveaway is that when you echo it, it appears to print:
/subscriptions?v=2&alt=jsoneeds/api/users/KEYWORD FROM FILE
but it's really printing:
https://gdata.youtube.com/feeds/api/users/KEYWORD FROM FILE
/subscriptions?v=2&alt=json
...with just a carriage return between them, so the second overwrites the first.
So what can you do about it? Here's a fairly easy way to trim the cr at the end of the line:
cr=$'\r'
while read line
do
line="${line%$cr}"
curl "https://gdata.youtube.com/feeds/api/users/${line}/subscriptions?v=2&alt=json" \
> "/home/user/archive/$line"
done < textfile.txt
Your current version should work, I think. More elegant is to use a single pair of double quotes around the whole URL with the variable in ${}:
"https://gdata.youtube.com/feeds/api/users/${line}/subscriptions?v=2&alt=json"
Just use it like this; it should be sufficient:
curl "https://gdata.youtube.com/feeds/api/users/${line}/subscriptions?v=2&alt=json" > "/home/user/archive/${line}"
If your shell gives you issues with the &, just put \&, but it works fine for me without it (inside double quotes the & is not special, so it shouldn't need escaping).
If the data from the file can contain spaces and you have no objection to spaces in the file name in the /home/user/archive directory, then what you've got should be OK.
Given the contents of the rest of the URL, you could even just write:
while read line
do
curl "https://gdata.youtube.com/feeds/api/users/${line}/subscriptions?v=2&alt=json" \
> "/home/user/archive/${line}"
done < textfile.txt
where strictly the ${line} could be just $line in both places. This works because the strings are fixed and don't contain shell metacharacters.
Since your code is close to this, but you claim that you're seeing the keywords from the file in the wrong place, maybe a little rewriting for ease of debugging is in order:
while read line
do
url="https://gdata.youtube.com/feeds/api/users/${line}/subscriptions?v=2&alt=json"
file="/home/user/archive/${line}"
curl "$url" > "$file"
done < textfile.txt
Since the strings may end up containing spaces, it seems (do you need to expand spaces to + in the URL?), the quotes around the variables are strongly recommended. You can now run the script with sh -x (or add a line set -x to the script) and see what the shell thinks it is doing as it is doing it.
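For example (fetch.sh is just a stand-in name for a script containing the loop above):
bash -x fetch.sh   # prints each command, fully expanded, before running it
set -x             # or add this near the top of the script itself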
