Error with array and sed WP config replacement - bash

newwpuser=$cpuser"_"$wpuser
newwpdb=$cpuser"_"$wpdb
wpdb=($(find . -name "wp-config.php" -print0 | xargs -0 -r grep -e "DB_NAME" | cut -d \' -f 4))
wpuser=($(find . -name "wp-config.php" -print0 | xargs -0 -r grep -e "DB_USER" | cut -d \' -f 4))
wpconfigchanges=($(find . -name wp-config.php -type f))
for i in "${wpconfigchanges[#]}"; do -exec sed -i -e "/DB_USER/s/'$wpuser'/'$newwpuser'/" | -exec sed -i -e "/DB_NAME/s/'$wpdb'/'$newwpuser'/"; done
I am trying to run the above in order to find all WordPress configs and prefix the DB user and DB name with cpuser_.
However, I get the following error:
./test.sh: line 85: -exec: command not found
./test.sh: line 85: -exec: command not found
Have I written the -exec commands incorrectly?
cpuser is supplied on execution.
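For reference, a minimal sketch of the loop without -exec (it assumes each wp-config.php defines DB_NAME and DB_USER with single-quoted values and that $cpuser is already set; the current values are read per file instead of from the global arrays):
for config in "${wpconfigchanges[@]}"; do
    # read the current values from this particular config file
    wpdb=$(grep "DB_NAME" "$config" | cut -d\' -f4)
    wpuser=$(grep "DB_USER" "$config" | cut -d\' -f4)
    # prefix both values with the cPanel user, editing the file in place
    sed -i -e "/DB_USER/s/'$wpuser'/'${cpuser}_${wpuser}'/" \
           -e "/DB_NAME/s/'$wpdb'/'${cpuser}_${wpdb}'/" "$config"
done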

Related

Execute an if statement on every folder

I have, for example, 3 files (it could be 1 or it could be 30) like this:
name_date1.tgz
name_date2.tgz
name_date3.tgz
When extracted it will look like :
name_date1/data/info/
name_date2/data/info/
name_date3/data/info/
Here is how it looks inside each folder:
name_date1/data/info/
you.log
you.log.1.gz
you.log.2.gz
you.log.3.gz
name_date2/data/info/
you.log
name_date3/data/info/
you.log
you.log.1.gz
you.log.2.gz
What I want to do is concatenate all the you.log files from each folder, and then concatenate all of those results into one single file.
1st step: extract all the folders
for a in *.tgz
do
    a_dir=${a%.tgz}
    mkdir "$a_dir" 2>/dev/null
    tar -xvzf "$a" -C "$a_dir" >/dev/null
done
2nd step: execute an if statement on each available folder and cat everything
myarray=(`find */data/info/ -maxdepth 1 -name "you.log.*.gz"`)
ls -d */ | xargs -I {} bash -c "cd '{}' &&
if [ ${#myarray[@]} -gt 0 ];
then
find data/info -name "you.log.*.gz" -print0 | sort -z -rn -t. -k4 | xargs -0 zcat | cat -
data/info/you.log > youfull1.log
else
cat - data/info/you.log > youfull1.log
fi "
cat */youfull1.log > youfull.log
My issue is that when I use multiple name_date*.tgz files it gives me this error:
gzip: stdin: unexpected end of file
Despite the error, I still get all my files concatenated, but why the error message?
But when I use only one .tgz file, I don't have any issue regardless of the number of you.log files.
Any suggestions, please?
Try something simpler. There is no need for myarray. Pass the files one at a time, as they come in, and decide what to do with each one. Try:
find */data/info -maxdepth 1 -type f -name "you.log*" -print0 |
    sort -z |
    xargs -0 -n1 bash -c '
        if [[ "${1##*.}" == "gz" ]]; then
            zcat "$1"
        else
            cat "$1"
        fi
    ' --
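If you only want the single combined file, redirecting that whole pipeline is enough (a sketch, assuming the archives are already extracted as in your 1st step; youfull.log is the name from your original script):
find */data/info -maxdepth 1 -type f -name "you.log*" -print0 |
    sort -z |
    xargs -0 -n1 bash -c '
        # rotated logs are decompressed, the plain log is passed through
        if [[ "${1##*.}" == "gz" ]]; then
            zcat "$1"
        else
            cat "$1"
        fi
    ' -- > youfull.log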
If you have to iterate over directories, don't use ls, still use find.
find . -maxdepth 1 -type d -name 'name_date*' -print0 |
    sort -z |
    while IFS= read -r -d '' dir; do
        cat "$dir"/data/info/you.log
        find "$dir"/data/info -maxdepth 1 -type f -name 'you.log.*.gz' -print0 |
            sort -z -t'.' -n -k3 |
            xargs -r -0 zcat
    done
or (if you have to) with xargs, which should give you an idea of how it's used:
find . -maxdepth 1 -type d -name 'name_date*' -print0 |
    sort -z |
    xargs -0 -n1 bash -c '
        cat "$1"/data/info/you.log
        find "$1"/data/info -maxdepth 1 -type f -name "you.log.*.gz" -print0 |
            sort -z -t"." -n -k3 |
            xargs -r -0 zcat
    ' --
Use the -t option with xargs to see what it's doing.
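A tiny self-contained demo of what -t shows (each constructed command is echoed to stderr before it runs; the directory names are just placeholders):
printf '%s\0' name_date1 name_date2 |
    xargs -t -0 -n1 echo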

Why does my xargs command with a pipe work only for a single file, but not multiple?

I am trying to pipe a few commands in a row; it works with a single file, but gives me an error once I try it on multiple files at once.
On a single file in my working folder:
find . -type f -iname "summary.5runs.*" -print0 | xargs -0 cut -f1-2 | head -n 2
#It works
Now I want to scan all files with a certain prefix/suffix in their names in all subdirectories of my working folder, then write the results to a text file:
find . -type f -iname "ww.*.out.txt" -print0 | xargs -0 cut -f3-5 | head -n 42 > summary.5runs.txt
#Error: xargs: cut: terminated by signal 13
I guess my problem is iterating through multiple files, but I am not sure how to do it.
Your final head stops after 42 lines of total output, but you want it to operate per file (signal 13 is SIGPIPE: once head has printed its 42 lines and exited, cut is killed the next time it writes to the broken pipe). You could fudge around with a subshell in xargs:
xargs -0 -I{} bash -c 'cut -f3-5 "$1" | head -n 42' _ {} > summary.5runs.txt
or you could make it part of an -exec action:
find . -type f -iname "ww.*.out.txt" \
-exec bash -c 'cut -f3-5 "$1" | head -n 42' _ {} \; > summary.5runs.txt
Alternatively, you could loop over all the files in the subshell so you have to spawn just one:
find . -type f -iname "ww.*.out.txt" \
-exec bash -c 'for f; do cut -f3-5 "$f" | head -n 42; done' _ {} + \
> summary.5runs.txt
Notice the {} + instead of {} \;: it hands as many file names as fit to a single bash invocation, which the for loop then iterates over, rather than spawning one bash per file.

Append line break to every cat in pipe

I have the following pipeline:
find /my/place -name 'test_*_blub' | xargs cat
While this works fine, I also want all file content to be terminated by a line break (\n).
I could not yet figure out how to append the newline.
To print a line break \n after each file's content, use one of the following approaches:
1) running shell commands
find /my/place -name 'test_*_blub' | xargs -I % sh -c 'cat %; echo "";'
sh -c 'cat %; echo "";' runs multiple commands one by one
2) with -exec action:
find /my/place -name 'test_*_blub' -exec cat {} \; -exec echo "" \;
3) with -printf action:
find /my/place -name 'test_*_blub' -exec cat {} \; -printf "\n"
Figured out an easy way:
find /my/place -name 'test_*_blub' | xargs cat | xargs -I '{}' echo '{}'
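If the file names may contain spaces, a per-file shell driven by xargs -0, similar to the -exec variants above but passing the name as an argument rather than substituting it into the command, is a safer sketch:
find /my/place -name 'test_*_blub' -print0 |
    xargs -0 -n1 sh -c 'cat "$1"; echo' _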

How to combine these 2 commands into single line command

I have 2 useful bash commands below, but I want to combine them.
Is it possible to do?
find "$1" -type f -print0 | xargs -0 sha1sum -b
find "$1" -type f ! -iname '*thumbs.db*' -print0 | xargs -0 stat -c "%y %s %n"
If you want to write it on one line, you can just use "&" to combine the commands; note that "&" puts the first command in the background, so the two run in parallel and their output may interleave (use ";" or "&&" instead if they should run one after the other). Maybe this is what you meant:
find "$1" -type f -print0 | xargs -0 sha1sum -b & find "$1" -type f ! -iname '*thumbs.db*' -print0 | xargs -0 stat -c "%y %s %n"

Sed does not replace in jenkins when using variable

I'm running shell scripts in Jenkins as below (changed to Groovy format). But when I try to run it in Jenkins, it does not replace the value for port_https. However, when I run it manually, it changes the value as exported.
sh "find ./ -type f -name '*.yml' -exec sed -i -e \"s/url_protocol: http\$/url_protocol: https/g\" {} \\ ;"sh "find ./ -type f -name '*.yml' -exec sed -i -e \'s/protocol: \"http\"\$/protocol: \"https\"/g\' {} \\;"sh "export abc_port=\"\$(find ./pdf/myInventory/dyn_conf/group_vars/abc1/vars.yml -exec grep abc_solution_portbase: {} \\; | cut -d: -f2)\" ; export port_https=\$((\$abc_port + 81)) ; export port=\"\$(find ./pdf/myInventory/dyn_conf/group_vars/def1/vars.yml -exec sh -c 'grep -A 3 -e def_nei_abc_host {} | grep -e port' \\; | cut -d: -f2)\" ; find ./pdf/myInventory/dyn_conf/group_vars/def1/vars.yml -exec sed -i -e 's/port:\${port}/port: \"\"\${port_https}\"\"/g' {} \\;"
The console output shows the correct exported value for each parameter.
Update:
The line that has the problem:
find ./pdf/myInventory/dyn_conf/group_vars/def1/vars.yml -exec sed -i -e 's/port: "24080"$/port: 24081/g' '{}' ';'
I'm trying to make the port become:
port: "24081"
As you said, I changed the single quotes to double quotes as below:
find ./pdf/myInventory/dyn_conf/group_vars/def1/vars.yml -exec sed -i -e \"s/port:\${port}\$/port: \"\${port_https}\"/g\" {} \\;"
But when I try to run it using Jenkins, the double quotes for the port are not recognized, and it becomes:
port: 24081
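For what it's worth, at the plain shell level the replacement only keeps literal double quotes if they are escaped inside the double-quoted sed expression, roughly like this sketch (it assumes ${port} expands to the old quoted value, space and quotes included, and ${port_https} to the bare new number):
find ./pdf/myInventory/dyn_conf/group_vars/def1/vars.yml \
    -exec sed -i -e "s/port:${port}$/port: \"${port_https}\"/g" {} \;
When that command is embedded in a Groovy double-quoted sh "..." string, each \" additionally has to be written as \\\" and each shell $ as \$, so that one layer of escaping is consumed by Groovy and the rest reaches the shell.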
