I am trying to iterate through a list and curl each entry; ultimately this is to kick off a list of Jenkins jobs.
So I have a text file whose contents are:
ApplianceInsurance-Tests
BicycleInsurance-Tests
Breakdown-Tests
BridgingLoans-Tests
Broadband-Tests
Business-Loans
BusinessElectric-Tests
BusinessGas-Tests
I am trying to create a loop that fires a curl command for each line in the txt file:
for fn in `cat jenkins-lists.txt`; do "curl -X POST 'http://user:key@x.x.x.xxx:8080/job/$fn/build"; done
but I keep getting an error - No such file or directory.
Getting a little confused.
Your do-done body is quoted wrong. It should be:
curl -X POST "http://user:key#x.x.x.xxx:8080/job/$fn/build"
I'd also recommend:
while read -r fn; do
curl -X POST "http://user:key#x.x.x.xxx:8080/job/$fn/build"
done < jenkins-lists.txt
instead of for fn in $(anything); do .... With the second form you don't have to worry about inadvertent word splitting and globbing, and the jenkins-lists.txt file gets read line by line instead of being slurped into memory all at once (not that it matters for such a small file, but why not use a technique that works well more or less regardless of file size?).
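To see the globbing hazard concretely, here is a hypothetical demonstration (demo.txt is a made-up file):
printf '%s\n' 'Broadband-Tests' '*' > demo.txt
for fn in $(cat demo.txt); do echo "$fn"; done    # the * expands to every file name in the current directory
while read -r fn; do echo "$fn"; done < demo.txt  # prints the literal *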
If the error had come from curl, it would probably have been HTML-formatted. The only way I can reproduce the error you describe is by cat-ing a non-existent file.
Double check the name of the jenkins-lists.txt file, and make sure your script is running in the same directory as the file. Or use an absolute path to the file.
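Putting both answers together, a minimal corrected script, assuming the list file sits next to the script (that location is an assumption):
#!/bin/bash
cd "$(dirname "$0")" || exit 1   # run from the script's own directory so the relative path resolves
while read -r fn; do
  curl -X POST "http://user:key@x.x.x.xxx:8080/job/$fn/build"
done < jenkins-lists.txt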
Related
I have a text file with a list of s3 objects, in the form of:
prefix_x/prefix_y/file_name_1
prefix_w/prefix_z/file_name_88
etc...
I wrote a bash script to delete all of these objects, as follows:
#!/bin/bash
LIST_OF_PATHS=$1
while read FILE_PATH; do
  aws s3 rm s3://bucket-name/$FILE_PATH
done < $LIST_OF_PATHS
The script doesn't seem to delete the objects (they still appear in the UI, and in the terminal when listing them with the CLI).
Further details and things I've already tried:
deleting the objects manually with a similar command in the CLI - it works.
adding an ls command to the loop produces no output, whereas typing the same command manually on the very same files does give output.
adding sleep 0.1 to each iteration of the loop didn't help either.
of course the script runs - I see the output delete: s3://bucket-name/prefix_x/prefix_y/file_name_1, but the file doesn't actually get deleted.
running a simpler bash script with the same command and a specific file name (not inside a loop) does delete successfully.
What might be the problem?
Solved!
The issue was that in a script, bash expects the line-ending character to be '\n', but my input file contained a '\r' (carriage return) at the end of each line, i.e. Windows-style line endings. More on this can be found here:
https://superuser.com/questions/489180/remove-r-from-echoing-out-in-bash-script/489191
Many thanks to @Barmar, whose comment helped me see this and debug the issue.
I simply changed input file itself, and the script ran perfectly as it was.
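For reference, instead of editing the input file by hand, any of the usual carriage-return strippers would also have fixed it (file names here are placeholders):
tr -d '\r' < paths.txt > paths-unix.txt   # write a cleaned copy
sed -i 's/\r$//' paths.txt                # or strip in place (GNU sed)
FILE_PATH=${FILE_PATH%$'\r'}              # or trim the \r inside the loop itself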
Check your AWS permissions on the S3 service.
Check your AWS command line tool configuration.
Check your s3 bucket configuration.
Run your AWS s3 command on a single object, without the loop, and confirm it succeeds before using a loop.
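For example, taking one path from the question's list (the bucket name is a placeholder):
aws s3 rm s3://bucket-name/prefix_x/prefix_y/file_name_1
Adding --dryrun shows what would be deleted without actually deleting anything.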
I have a file containing a list of pdf files.
this is a test.pdf
this is a/another test.pdf
test number 5_ example.pdf
...
I know that doing:
curl -O "https://report2018/this is another test.pdf"
will download the file.
So the scenario is to use curl in a bash script to download all the PDFs listed in the file, one by one, where the beginning of each URL is "https://report2018/".
So a complete URL will be https://report2018/+PDF_NAME
Any ideas or suggestions how to do it in bash script?
It is a pretty basic script actually...
You can break your problem into two pieces:
Basic usage of bash, in general;
How to loop over a file.
Something like this will suffice:
#!/bin/bash
exec 3<file.list                                      # put the file on a file descriptor
while read -r line <&3; do                            # read the file descriptor line by line
  curl -sk "https://report2018/${line}" > "${line}"   # fetch the URL and save the response to a file
done
exec 3>&-                                             # close the file descriptor
Obviously you have to adapt the curl invocation to your needs (e.g. User-Agent and/or authentication).
Note that with > "${line}" after curl, you save each file under the same name that was read from file.list.
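One caveat, since the names in the question contain spaces and slashes: a raw space is not valid in a URL, and a slash in ${line} means the target directory has to exist before the redirection. A minimal sketch of a loop body handling both (the substitution only covers spaces):
mkdir -p "$(dirname "${line}")"                            # some names contain /, so create the directory first
curl -sk "https://report2018/${line// /%20}" > "${line}"   # percent-encode the spaces for the URL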
I hope this helps you, and please, next time, use a search engine first.
I am trying to create a (my first) bash script, but I need a little help. I have the following:
#!/bin/bash
echo "Write a LaTeX equation:"
read -e TeXFormula
URIEncoded = node -p "encodeURIComponent('$(sed "s/'/\\\'/g" <<<"$TeXFormula")')"
curl http://latex.codecogs.com/gif.latex?$URIEncoded -o /Users/casparjespersen/Desktop/notetex.gif | pbcopy
I want it to:
Require user input (LaTeX equation)
URIEncode user input (the top result in Google was using node.js, but anything will do..)
Perform a cURL call to this website converting equation to a GIF image
Copy the image to the clipboard, so I can paste it into a note-taking app like OneNote, Word, etc.
My script is malfunctioning in the following:
URIEncoded is undefined, so there is something wrong with my variable definition.
When I copy using pbcopy, the encoded text content of the image is copied, and not the actual image. Is there a workaround for this? Otherwise, the script could automatically open the image and I could manually Cmd + C the content.
URIEncoded is undefined, so there is something wrong with my variable definition.
The line should read
URIEncoded=$(node -p "encodeURIComponent('$(sed "s/'/\\\'/g" <<<"$TeXFormula")')")
without spaces around the = sign, and using the $() construct to actually execute the command; otherwise the text of the command would be assigned to the variable.
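If you would rather not depend on node at all, a percent-encoder can be written in pure bash. A minimal sketch, handling single-byte characters only (urlencode is a hypothetical helper, not a built-in):
urlencode() {
  local s=$1 c out= i
  for (( i = 0; i < ${#s}; i++ )); do
    c=${s:i:1}
    case $c in
      [a-zA-Z0-9.~_-]) out+=$c ;;                 # unreserved characters pass through
      *) printf -v c '%%%02X' "'$c"; out+=$c ;;   # everything else becomes %XX
    esac
  done
  printf '%s' "$out"
}
URIEncoded=$(urlencode "$TeXFormula")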
When I copy using pbcopy, the encoded text content of the image is copied, and not the actual image. Is there a workaround for this? Otherwise, the script could automatically open the image and I could manually Cmd + C the content.
pbcopy takes input from stdin but you are telling curl to write the output to a file rather than stdout. Try simply
curl http://latex.codecogs.com/gif.latex?$URIEncoded | pbcopy
or, for the second option you describe
curl http://latex.codecogs.com/gif.latex?$URIEncoded -o /Users/casparjespersen/Desktop/notetex.gif && open $_
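(In the second form, $_ expands to the last argument of the previous command, i.e. the path passed to -o, so open displays the freshly written gif.)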
Due to processes outside my control, I need to run multiple SH files which contain lengthy CURL commands. The problem is that whichever process created these commands seems to have included one line of whitespace at the very end. If I call the file as is, it fails. If I physically open the file, hit backspace on the first fully empty line, and save, it works perfectly.
Any way to put some kind of command into the SH file so that it removes any unnecessary stuff?
More info would be helpful, but the following might work:
If you need to put something into each of the files that contain the curl commands, as you mention, you could try putting exit as the last line of the curl script (this also depends on how you're calling the 'curl files'):
exit
If you can run a separate script against the files that have a blank line, perhaps sed the blank lines away?
sed -i '/^\s*$/d' "$fileWithLineOfSpaces"
edit:
Or (after thinking about it), perhaps simply delete the last line of the file....
sed -i '$d' "$file"
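It may also be worth verifying that the 'blank' line really is blank: a trailing empty line is normally harmless to sh, so the offending character may actually be a carriage return from Windows-style line endings. If so, strip those instead (GNU sed assumed):
sed -i 's/\r$//' "$file"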
I am trying to call a script from within another script. The idea is that the program should take in email that is sent to it directly from unix mail as stdin, and then parse some stuff out and send it to a new script.
I am having trouble reaching the new script. However, this problem only occurs when the script is accepting the email directly. If I cat a file into it, there is no problem and it finds the new script.
i.e. if I have a test file called "email.txt" and I do the command:
cat email.txt | ./receiveEmail.sh
then the script calling works fine.
but if receiveEmail.sh receives the email directly, it fails to call the new script. I know this is the point where it fails, because I get logs showing the script works all the way up to where it tries to call the new script.
--------receiveEmail.sh----------
#!/bin/bash
###do some stuff to parse the stdin coming in and get variable $subject and $body
issue=`. /home/dlaf/bin/makeissue.sh`   # <- this is the line that doesn't seem to work when the input comes straight from the email rather than from a txt file I send it
I am confused as to why. I think it might be because I am missing some part of the path? Maybe the email coming in has no idea what my full path actually is? I'm not sure, though, because when I type echo $LD_LIBRARY_PATH at the command line I just get a blank line, so I assume it's not even set, and I don't know how this could be a problem.
When saving output to a variable with Bash, I usually do this
read issue < <(~/bin/makeissue.sh)
echo "$issue"
If the output is multiple lines you can do this
read -d '' issue < <(~/bin/makeissue.sh)
echo "$issue"
or this
mapfile issue < <(~/bin/makeissue.sh)
echo "${issue[#]}"