xargs does not seem to separate parameters - bash

I am trying to use xargs to set secrets in GitHub using the gh CLI.
Given I have an .env file with the following entries:
SECRET1=djfjgdfkjg
SECRET2=jbnfdgjn
SECRET3=A line of text
And the sed command sed -r 's/^([A-Za-z0-9_]*)=(.*)$/\1 -b "\2"/g' ./.env produces the following output:
SECRET1 -b "djfjgdfkjg"
SECRET2 -b "jbnfdgjn"
SECRET3 -b "A line of text"
I am unsure as to why the command:
sed -r 's/^([A-Za-z0-9_]*)=(.*)$/\1 -b "\2"/g' test.env | xargs -I {} gh secret set {}
fails for each secret with the message secret name can only contain letters, numbers, and _
Manually running gh secret set SECRET1 -b "djfjgdfkjg" works without an error.
I'm guessing that the issue is that the first argument (the secret name) is being passed the value SECRET1 -b "djfjgdfkjg" rather than just SECRET1, but I'm unsure how to fix this.
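That guess is easy to confirm: with -I, xargs passes each whole input line as a single argument, quotes and all. A minimal demo:
printf '%s\n' 'SECRET1 -b "djfjgdfkjg"' | xargs -I {} printf '<%s>\n' {}
# prints <SECRET1 -b "djfjgdfkjg"> -- one argument, not three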

After a bit more digging I discovered that the problem is the use of -I, and that:
sed -rn 's/^[[:space:]]*([[:alpha:]][[:alnum:]_]*)=(.*)$/\1 -b "\2"/p' .env | xargs -n 3 gh secret set
resolves the problem.
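For the record, -n 3 works because without -I xargs performs its own quote parsing, so "A line of text" survives as a single argument. If you would rather skip the sed rewrite altogether, a plain read loop does the same job (a sketch, assuming every line of the .env file is exactly NAME=value):
while IFS='=' read -r name value; do
    gh secret set "$name" -b "$value"
done < .env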

Related

How to pipe multiple commands to bash?

I want to check some files on a remote website.
Here is a bash command that generates the commands to calculate each file's md5:
[root]# head -n 3 zrcpathAll | awk '{print $3}' | xargs -I {} echo wget -q -O - -i {}e \| md5sum\;
wget -q -O - -i https://example.com/zrc/3d2f0e76e04444f4ec456ef9f11289ec.zrce | md5sum;
wget -q -O - -i https://example.com/zrc/e1bd7171263adb95fb6f732864ceb556.zrce | md5sum;
wget -q -O - -i https://example.com/zrc/5300b80d194f677226c4dc6e17ba3b85.zrce | md5sum;
Then I pipe the generated commands to bash, but only the first command is executed.
[root]# head -n 3 zrcpathAll | awk '{print $3}' | xargs -I {} echo wget -q -O - -i {}e \| md5sum\; | bash -v
wget -q -O - -i https://example.com/zrc/3d2f0e76e04444f4ec456ef9f11289ec.zrce | md5sum;
3d2f0e76e04444f4ec456ef9f11289ec -
[root]#
Would you please try the following instead:
while read -r _ _ url _; do
    wget -q -O - "$url"e | md5sum
done < <(head -n 3 zrcpathAll)
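Here read -r _ _ url _ splits each input line on whitespace and keeps only the third field in url, mirroring the awk '{print $3}' in the original pipeline; the trailing _ soaks up any remaining fields.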
Note that we should not put -i in front of "$url" here.
[Explanation of the -i option]
The manpage of wget says:
-i file
--input-file=file
    Read URLs from a local or external file. [snip] If this function is used, no URLs need be present on the command line. [snip] If the file is an external one, the document will be automatically treated as html if the Content-Type matches text/html. Furthermore, the file's location will be implicitly used as base href if none was specified.
where the file contains lines of URLs such as:
https://example.com/zrc/3d2f0e76e04444f4ec456ef9f11289ec.zrce
https://example.com/zrc/e1bd7171263adb95fb6f732864ceb556.zrce
https://example.com/zrc/5300b80d194f677226c4dc6e17ba3b85.zrce
Whereas if we invoke it as -i url, wget first downloads url and treats the downloaded document as a file containing a list of URLs, as above. In our case the url is itself the target to download, not a list of URLs, so wget reports the error: No URLs found in url.
Even though wget fails, why does the command output just one md5sum line rather than three? This seems to be because head immediately flushes the remaining lines when the piped subprocess fails.
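If you prefer to keep an xargs pipeline, another option is to let xargs start a shell per URL instead of echoing commands into bash; the pipe then runs inside the child shell (a sketch, equivalent to the loop above):
head -n 3 zrcpathAll | awk '{print $3}' |
    xargs -I {} sh -c 'wget -q -O - "$1"e | md5sum' _ {}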

Can't use gsutil cp with gitlab CI

I'm using a GitLab runner on a Mac mini server.
Logged in as the user "runner", I can run this command:
gsutil ls -l gs://tests/ | grep staging | sort -k 2 | tail -n 3 | head -n 2 | awk '{print $3}' | gsutil -m cp -I .
This gets me the files, but when I use the same command in gitlab-ci.yml like this:
stages:
  - test

test:
  stage: test
  when: always
  script:
    - gsutil ls -l gs://tests/ | grep staging | sort -k 2 | tail -n 3 | head -n 2 | awk '{print $3}' | gsutil -m cp -I .
I get the error:
bash: line 141: gsutil: command not found
I also checked, and the GitLab runner is using the same user. The runner is configured with the shell executor. Changing the command to use the full path to gsutil didn't help either. I added whoami to the gitlab-ci.yml and it reported the same user, "runner".
I managed to solve this issue by using the solution from gcloud-command-not-found-while-installing-google-cloud-sdk: I included these two lines in my gitlab-ci.yml before using the gsutil command:
source '[path-to-my-home]/google-cloud-sdk/path.bash.inc'
source '[path-to-my-home]/google-cloud-sdk/completion.bash.inc'
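For reference, the job then looks something like this (a sketch; path.bash.inc is what puts the SDK on PATH, while completion.bash.inc only provides interactive tab-completion, so it is optional in CI):
test:
  stage: test
  when: always
  script:
    - source '[path-to-my-home]/google-cloud-sdk/path.bash.inc'
    - source '[path-to-my-home]/google-cloud-sdk/completion.bash.inc'
    - gsutil ls -l gs://tests/ | grep staging | sort -k 2 | tail -n 3 | head -n 2 | awk '{print $3}' | gsutil -m cp -I .
The underlying issue is likely that these files are normally sourced from the login shell's profile, which a non-interactive CI shell does not read.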

Environment variable overrides command

I set the EC2_IP_ADDRESS variable:
$ export EC2_IP_ADDRESS="`docker run -it -v $PWD/infrastructure:/terraform -v $PWD/data:/data terraform sh -c "terraform init; terraform state show module.aws_ec2.aws_eip.aws_instance_eip" | grep public_ip | awk '{print $3}'`"
And then I'm trying to copy some files into the EC2 instance:
$ scp -i key.pem -r src/* ec2-user@$EC2_IP_ADDRESS:/home/ec2-user/src/
But the output is an error: : nodename nor servname provided, or not known
Output of $ echo "scp -i key.pem -r src/* ec2-user@$EC2_IP_ADDRESS:/home/ec2-user/src/"
:/home/ec2-user/src/c/* ec2-user@X.X.X.X
It seems that anything after the variable EC2_IP_ADDRESS goes to the beginning of the string, overwriting the start of the command.
Any ideas on how to fix this?
It seems the variable contains $'\r' at the end. Remove it with:
EC2_IP_ADDRESS=${EC2_IP_ADDRESS%$'\r'}
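You can make the stray carriage return visible with printf, and strip it at the source instead (a sketch; the \r most likely comes from docker run -it allocating a pseudo-TTY, which turns LF into CRLF):
printf '%q\n' "$EC2_IP_ADDRESS"   # shows something like $'X.X.X.X\r'
EC2_IP_ADDRESS=$(docker run ... | grep public_ip | awk '{print $3}' | tr -d '\r')
Dropping the -t flag from docker run would avoid the CRLF translation altogether.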

Bash: Parse Urls from file, process them and then remove them from the file

I am trying to automate a procedure where the system fetches the contents of a file (one URL per line), uses wget to grab the files from the site (an https folder), and then removes the line from the file.
I have made several attempts, but the sed part (at the end) cannot match the string (I tried escaping characters) and remove it from the file!
cat File
https://something.net/xxx/data/Folder1/
https://something.net/xxx/data/Folder2/
https://something.net/xxx/data/Folder3/
My line of code is:
cat File | xargs -n1 -I # bash -c 'wget -r -nd -l 1 -c -A rar,zip,7z,txt,jpg,iso,sfv,md5,pdf --no-parent --restrict-file-names=nocontrol --user=test --password=pass --no-check-certificate "#" -P /mnt/USB/ && sed -e 's|#||g' File'
It works up until the sed -e 's|#||g' File part.
Thanks in advance!
Don't use cat if possible; it's bad practice and can be a problem with big files. You can change
cat File | xargs -n1 -I # bash -c
to
for siteUrl in $( < "File" ); do
That's more correct, and it's simpler to use sed with double quotes. My variant:
scriptDir=$( dirname -- "$0" )
for siteUrl in $( < "$scriptDir/File.txt" )
do
    if [[ -z "$siteUrl" ]]; then break; fi  # stop at the first empty line
    wget -r -nd -l 1 -c -A rar,zip,7z,txt,jpg,iso,sfv,md5,pdf --no-parent --restrict-file-names=nocontrol --user=test --password=pass --no-check-certificate "$siteUrl" -P /mnt/USB/ && sed -i "s|$siteUrl||g" "$scriptDir/File.txt"
done
@beliy's answer looks good!
If you want a one-liner, you can do:
while read -r line; do \
wget -r -nd -l 1 -c -A rar,zip,7z,txt,jpg,iso,sfv,md5,pdf \
--no-parent --restrict-file-names=nocontrol --user=test \
--password=pass --no-check-certificate "$line" -P /mnt/USB/ \
&& sed -i -e '\|'"$line"'|d' "File.txt"; \
done < File.txt
EDIT:
You need to add a \ in front of the first pipe so that sed accepts | as the address delimiter (\|...|d).
I believe you just need to use double quotes after sed -e. Instead of:
'...&& sed -e 's|#||g' File'
you would need
'...&& sed -e '"'s|#||g'"' File'
I see what you're trying to do, but I don't understand the sed command with the pipes in it; maybe it's some fancy format I'm not familiar with. Anyway, I think the sed command should look like this:
sed -e 's/#//g'
This removes every # from the stream.
I hope this helps!
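A caveat that applies to all of the variants above: sed treats $siteUrl both as a regular expression and as part of its script, so characters such as & or the chosen delimiter inside a URL can still break it. If the goal is simply to delete an exact line, a fixed-string match avoids escaping entirely (a sketch, assuming the file holds one URL per line):
grep -vxF "$siteUrl" File.txt > File.txt.tmp
mv File.txt.tmp File.txt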

curl complex usage with pattern

I'm trying to fetch 2 files with curl based on some pattern, but that doesn't seem to work:
Files:
SystemOut_15.04.01_21.12.36.log
SystemOut_15.04.01_15.54.05.log
curl -f -k -u "login:password" https://myserver/cgi-bin/logviewer/index.cgi?getlogfile=SystemOut_15.04.01_21.12.36.log'&'server=qwerty123.com'&'numlines=100000000'&'appenv=MBL%20-%20PROD'&'directory=/app/WAS/was85/profiles/node/logs/mbl-server1
I know there is the -A option, but it doesn't help since my file name is inside the link.
How can I extract those 2 files using a pattern?
I solved it myself: one curl call gets the list of logs from the webpage, and another downloads the files. The code looks like:
for file in $(curl -f -k -u "user:pwd" https://selfservice.pwj.com/cgi-bin/logviewer/index.cgi?listdirectory=/app/smx_client_mob/data/log'&'appenv=MBL%20-%20PROD'&'server=xshembl04pap.she.pwj.com \
    | grep href \
    | sed 's/.*href="//' \
    | sed 's/".*//' \
    | sed 's/javascript:getLog//g' \
    | sed "s/['();]//g" \
    | grep -i 'service' \
    | grep '^[a-zA-Z].*'); do
    curl -o "$file" -f -k -u "user:pwd" https://selfservice.pwj.com/cgi-bin/logviewer/index.cgi?getlogfile="$file"'&'server=xshembl04pap.she.pwj.com'&'numlines=100000000'&'appenv=MBL%20-%20PROD'&'directory=/app/smx_client_mob/data/log
done
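A side note on the '&' fragments: they are quoted because a bare & would background the command and the shell would cut the URL short. Quoting the whole URL once reads more easily (the same download command, just requoted):
curl -o "$file" -f -k -u "user:pwd" \
    "https://selfservice.pwj.com/cgi-bin/logviewer/index.cgi?getlogfile=${file}&server=xshembl04pap.she.pwj.com&numlines=100000000&appenv=MBL%20-%20PROD&directory=/app/smx_client_mob/data/log"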
