This question already has answers here:
How to apply shell command to each line of a command output?
(9 answers)
Closed 5 months ago.
I have a simple ldapsearch bash script that returns the user's email when searching by ID. I made it take an argument as its input, since at the time I only needed to run it once or twice.
I'm wondering whether I can adapt it to take input from a file (a .txt, say) and append the outputs to another file.
This is what I have:
#!/bin/bash
if [ "$1" = "" ]; then
echo "how to: searchID.sh <userID>"
exit 1
fi
ldapsearch -x -b '' -LLL -h ldaphost.com -p 255 "uid=$1" mail >> outputs.txt
Instead of running it manually like:
./searchID.sh I0FT45
I want it to take input from a file with many IDs, like:
I0001F
IGLFK7
I37462
I4593N
And run it for all those entries.
Any help is very much appreciated
If your usernames are xargs "safe" (no spaces, no quotes), then you can do something like this:
xargs -I {} \
ldapsearch -x -b '' -LLL -h ldaphost.com -p 255 uid={} mail \
< file.in \
>> file.out
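If the IDs might contain characters that trip up xargs, a plain while read loop avoids the issue. A minimal sketch, assuming one ID per line in file.in:
while IFS= read -r id; do
  ldapsearch -x -b '' -LLL -h ldaphost.com -p 255 "uid=$id" mail
done < file.in >> file.out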
This question already has answers here:
Looping through the content of a file in Bash
(16 answers)
Closed 1 year ago.
For a uni assignment I need to write a while loop for the Bash shell (on Windows) that tries different passwords, line by line, from a given .txt file and uses each one to unzip a given .zip archive.
I know how to unzip archives and I know how to echo the .txt file's contents line by line, but I can't figure out how to combine the two:
#1
while read pw
do echo "$pw"
done < passwords.txt
#2
unzip -P $pw archive.zip
I would try it the following way:
while read -r pw
do
  # unzip exits with status 0 only when the password is correct,
  # so the if succeeds exactly once and we stop there
  if unzip -P "$pw" archive.zip
  then
    break
  fi
done < passwords.txt
You can read about the -r option of read in the Bash manual (help read).
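A slightly quieter variant of the same idea (a sketch: -qq suppresses unzip's file listing, 2>/dev/null hides the wrong-password errors, and -o avoids overwrite prompts) that also reports the match:
while read -r pw; do
  if unzip -qq -o -P "$pw" archive.zip 2>/dev/null; then
    echo "Password found: $pw"
    break
  fi
done < passwords.txt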
In bash there are several ways to accomplish this task, like the following:
echo -n "Please enter the password: "
read -s PASSWORD
echo "Debug: you entered the password \"${PASSWORD}\"."
Then call the unzip command...
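For example, a minimal sketch reusing the variable from above:
unzip -P "${PASSWORD}" archive.zip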
I have a hash file containing several md5 hashes.
I want to create a bash script to curl virustotal to check if the hashes are known.
#!/bin/bash
for line in "hash.txt";
do
echo $line; curl -s -X GET --url 'https://www.virustotal.com/vtapi/v2/file/report?apikey=a54237df7c5c38d58d2240xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxcc0a0d7&resource='$line'';
done
but it's not working.
Could you help me please?
Better use a while loop. Your for loop would only run once, because Bash interprets "hash.txt" as a literal string, not as the contents of a file. Try this:
while read -r line; do
echo "$line"
curl -s -X GET --url "https://www.virustotal.com/vtapi/v2/file/report?apikey=a54237df7c5c38d58d2240xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxcc0a0d7&resource=$line"
done <"/path/to/hash.txt"
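If you are on the public VirusTotal v2 API, you may also need to throttle the loop, since the public tier has historically allowed only about four requests per minute. A sketch (YOUR_API_KEY is a placeholder):
while read -r line; do
  echo "$line"
  curl -s --url "https://www.virustotal.com/vtapi/v2/file/report?apikey=YOUR_API_KEY&resource=$line"
  sleep 15   # ~4 requests per minute (assumed public-tier limit)
done < "/path/to/hash.txt"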
I have a JSON file with entries containing URLs (among other things), which I retrieve using curl.
I'd like to be able to run the loop several times at once to go faster, but also to cap the number of parallel curls, to avoid being kicked out by the remote server.
For now, my code looks like this:
jq -r '.entries[] | select(.enabled != false) | .id,.unitUrl' $fileIndexFeed | \
while read unitId; do
read -r unitUrl
if ! in_array tabAnnoncesExistantesIds $unitId; then
fullUnitUrl="$unitUrlBase$unitUrl"
unitFile="$unitFileBase$unitId.json"
if [ ! -f $unitFile ]; then
curl -H "Authorization:$authMethod $encodedHeader" -X GET $fullUnitUrl -o $unitFile
fi
fi
done
If I use a simple & at the end of my curl, it will run lots of concurrent requests, and I could get kicked.
So the question is (I suppose): how do I know that a curl launched with & has finished its job? If I can detect that, then I guess I can test, increment and decrement a variable tracking the number of running curls.
Thanks
Use GNU Parallel to control the number of parallel jobs. Either write your curl commands to a file so you can look at them and check them:
commands.txt
curl "something" "somehow" "toSomewhere"
curl "somethingelse" "someotherway" "toSomewhereElse"
Then, if you want no more than 8 jobs running at a time, run:
parallel -j 8 --eta -a commands.txt
Or you can just write the commands to GNU Parallel's stdin:
jq ... | while read ...; do
  printf "curl ...\n"   # newline, so each command is a separate job
done | parallel -j 8
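For example, the generator stage could look like this (a sketch reusing the question's variables, with the authorization header omitted for brevity; printf %q quotes each argument safely and emits one command per line):
jq -r '.entries[] | select(.enabled != false) | .id,.unitUrl' "$fileIndexFeed" |
while read -r unitId && read -r unitUrl; do
  printf 'curl -o %q %q\n' "$unitFileBase$unitId.json" "$unitUrlBase$unitUrl"
done | parallel -j 8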
Use a Bash function:
doit() {
unitId="$1"
unitUrl="$2"
if ! in_array tabAnnoncesExistantesIds $unitId; then
fullUnitUrl="$unitUrlBase$unitUrl"
unitFile="$unitFileBase$unitId.json"
if [ ! -f $unitFile ]; then
curl -H "Authorization:$authMethod $encodedHeader" -X GET $fullUnitUrl -o $unitFile
fi
fi
}
jq -r '.entries[] | select(.enabled != false) | .id,.unitUrl' $fileIndexFeed |
env_parallel -N2 doit
env_parallel will import the environment, so all shell variables are available.
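If you'd rather stay in plain Bash, you can also cap the number of background curls yourself. A minimal sketch, assuming Bash 4.3+ for wait -n and reusing the question's variables (the in_array and file-exists checks are omitted for brevity):
jq -r '.entries[] | select(.enabled != false) | .id,.unitUrl' "$fileIndexFeed" | {
  max_jobs=8
  while read -r unitId && read -r unitUrl; do
    # throttle: block until one of the running curls exits
    while (( $(jobs -rp | wc -l) >= max_jobs )); do
      wait -n
    done
    curl -H "Authorization:$authMethod $encodedHeader" \
         -o "$unitFileBase$unitId.json" "$unitUrlBase$unitUrl" &
  done
  wait   # let the last batch finish
}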
I have written some code, but I am having trouble with the double loop in my bash script. The script should read the files in the given directory one by one and upload each of them, but the value of "XYZ" changes for each file. Is there a way to make the script ask me to enter the "XYZ" value every time it reads a new file to upload, ideally naming that file in the prompt, like "please enter the XYZ value of <file name>"? I could not think of any way of doing this. I also have the XYZ values listed in a file in a different directory, so maybe that file could be read with a loop like the one I wrote for the path? I might actually need both approaches.
#!/bin/bash
FILES=/home/user/downloads/files/
for f in $FILES
do
  curl -F pitch=9 -F Name='astn' \
       -F "path=#/home/user/downloads/files;$f" -F "pass 1234" \
       -F "XYZ= 1.2" -F time=30 -F outputFormat=json \
       "http://blablabla.com"
done
Try the following:
#!/bin/bash
FILES=/home/user/downloads/files/
for f in $FILES
do
  echo "Please enter the name variable value here:"
  read Name
  curl -F pitch=9 -F "$Name" \
       -F "path=#/home/user/downloads/files;$f" -F "pass 1234" \
       -F "XYZ= 1.2" -F time=30 -F outputFormat=json \
       "http://blablabla.com"
done
I have added a read command inside the loop, so it will prompt the user for a value on each iteration. Since you haven't provided more details about your requirements, I haven't tested it completely.
The problem was actually the argument. Changing it to:
-F Name="$Name"
solved the problem. Passing the argument as just $Name or "$Name" on its own causes a bad request.
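Putting the pieces together, a sketch of the whole loop. Note the trailing * on the glob (without it, for iterates over the single directory name rather than the files inside it), and the prompt includes the current file's name. I've assumed curl's usual @file upload syntax and a pass=1234 field here; adjust those to whatever your endpoint actually expects:
#!/bin/bash
for f in /home/user/downloads/files/*; do
  # ask for this file's XYZ value, naming the file in the prompt
  read -r -p "Please enter the XYZ value of '$(basename "$f")': " xyz
  curl -F pitch=9 -F Name='astn' \
       -F "path=@$f" -F "pass=1234" -F "XYZ=$xyz" \
       -F time=30 -F outputFormat=json \
       "http://blablabla.com"
done
If the XYZ values are already listed in a file in the same order as the uploads, you could read them from a second file descriptor instead of prompting (read -r xyz <&3 inside the loop, with 3< xyz_values.txt after done).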
This question already has answers here:
Capturing output of find . -print0 into a bash array
(13 answers)
Closed 7 years ago.
I am currently writing a bash script that shall check some data. What I've got so far is:
#!/bin/bash
find "./" -mindepth 1 -maxdepth 1 -type d -print0 | while IFS= read -r -d '' file; do
folder=${file##*/}
echo "Checking ${folder} for sanity..."
./makeconfig ${folder} | while read -r line; do
title=`echo $line | awk -F' ' '{print $2}'`
echo $title
done
done
Now what it currently does is: search every directory in ./ and extract the folder's name (thus removing the ./ from the result of find), then hand it to a self-written tool, which outputs some lines like this:
-t 1 -a 2
-t 3 -a 5
-t 7 -a 7
-t 9 -a 8
of which I gather the value behind -t via awk. This also works so far. The problem is that the outer while loop stops after the first iteration, thus checking only one folder. My guess is that the two read commands of the inner and outer loop are colliding somehow. The tool makeconfig definitely always returns 0 (no error). I tried to debug it using sh -x script.sh, but it does not show me anything I can deal with.
Can someone point me in the right direction as to what is going wrong here? If you need any further information, I can provide it. I've written a quick mimicking program (also a script now, just echoing some stuff) if you want to test the bash script; just make it executable via chmod +x:
echo "-t 3 -a 4"
echo "-t 6 -a 1"
echo "-t 9 -a 5"
Just put this with the script in a folder and create some subfolders; that should be enough to make it work (as much as it does).
Thanks in advance!
EDIT: This is NOT a duplicate as mentioned. The problem here is more the nested read commands than the -print0 (maybe that also has something to do with it, but it is not the whole story).
IFS= does not unset the field separator entirely; it sets it to the empty string, and the NUL delimiter for -print0 comes from read -d '', so that combination is the standard idiom and is unlikely to be the bug by itself. One thing to check: makeconfig inherits its stdin from the outer loop, so if it ever reads stdin it will swallow the rest of find's output; redirecting its input (./makeconfig "$folder" < /dev/null) rules that out. If you run find without the -print0 argument (and your directory names are well-behaved), it'll also be easier to work with in bash. Two other alternatives:
- use xargs to run a shell script on each item found, with that item being the sole argument
- use -exec to run the shell script on each item.
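For instance, the -exec variant might look like this (a sketch; it moves the per-folder work into an inline child script, which also sidesteps any stdin sharing between find and makeconfig):
find ./ -mindepth 1 -maxdepth 1 -type d -exec sh -c '
  folder=${1##*/}
  echo "Checking $folder for sanity..."
  ./makeconfig "$folder" | while read -r flag tval rest; do
    echo "$tval"   # the value behind -t
  done
' _ {} \;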