Create an XML file with existing folder and file names through a shell script - bash

I'm very new to shell scripting. I need to create an XML file listing folders and the files within each folder. The requirement is something like the following.
For example, I have a folder called 'classes'.
Under this folder I have multiple files such as '1.cls', '2.cls', '3.cls', etc.
Similarly, I have other folders as well.
For example:
Folder name - 'Pages'
Files under that folder - '1.page', '2.page', '3.page', etc.
Now my XML file should look something like below:
<types>
    <members>1</members>
    <members>2</members>
    <members>3</members>
    <name>classes</name>
</types>
<types>
    <members>1</members>
    <members>2</members>
    <members>3</members>
    <name>Pages</name>
</types>

Try the following script in the directory that contains the required subdirectories and their files:
#!/bin/bash
declare -r XML_FILE="Sample.xml"

# Truncate the output file if it already exists
[ -f "${XML_FILE}" ] && : > "${XML_FILE}"

# Iterate over every subdirectory of the current directory
for directory_name in $(ls -F . | grep '/' | sed 's|/||')
do
    echo -e "<types>" >> "${XML_FILE}"
    dirfiles=$(ls -A "${directory_name}")
    if [ "${dirfiles}" ] ; then
        # One <members> element per file, with the extension stripped
        for files in ${dirfiles}
        do
            echo -e "\t<members>${files/.*}</members>" >> "${XML_FILE}"
        done
    fi
    echo -e "\t<name>${directory_name}</name>" >> "${XML_FILE}"
    echo -e "</types>" >> "${XML_FILE}"
done
Example
Following your example, set up the directories and files:
mkdir -p classes Pages
touch classes/{1.cls,2.cls,3.cls}
touch Pages/{1.page,2.page,3.page}
Save the script as xmls.sh.
Execute the script: bash xmls.sh
View the contents of Sample.xml: cat Sample.xml
<types>
    <members>1</members>
    <members>2</members>
    <members>3</members>
    <name>classes</name>
</types>
<types>
    <members>1</members>
    <members>2</members>
    <members>3</members>
    <name>Pages</name>
</types>
Sample.xml now contains the XML elements shown above.
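As an aside, parsing ls output breaks on directory or file names that contain spaces. Here is a sketch of the same idea using shell globs instead (not part of the original answer, offered only as a variant):
#!/bin/bash
XML_FILE="Sample.xml"
: > "${XML_FILE}"                      # create or truncate the output file
for dir in */ ; do                     # this glob matches only directories
    dir=${dir%/}                       # drop the trailing slash
    [ -d "${dir}" ] || continue        # skip the literal pattern if nothing matched
    printf '<types>\n' >> "${XML_FILE}"
    for file in "${dir}"/* ; do
        [ -e "${file}" ] || continue   # skip empty directories
        name=$(basename "${file}")
        printf '\t<members>%s</members>\n' "${name%%.*}" >> "${XML_FILE}"
    done
    printf '\t<name>%s</name>\n' "${dir}" >> "${XML_FILE}"
    printf '</types>\n' >> "${XML_FILE}"
done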

Related

How to add variable text to different files using a for loop in a bash script

I'm trying to add some text to the end of a few files.
I have made a file containing 5 server names. Each server name corresponds to a separate config file. (The paths of these config files are not known.)
I am using the code below to get the file paths.
My code:
#!/bin/bash
for i in $(cat serverlist-file | while read f; do find . -name "*$f*"; done);
do
echo $i
done
Output:
/data/servers/customer01/server01.cfg
/data/servers/customer01/server02.cfg
/data/servers/customer02/server03.cfg
/data/servers/customer03/server04.cfg
/data/servers/customer03/server05.cfg
I am using the code below to get the list of servers.
My code:
#!/bin/bash
for j in $(cat serverlist-file);
do
echo $j
done
Output:
server01
server02
server03
server04
server05
Now I want to edit those config files and add text to them.
I am using the code below to add the required text:
#!/bin/bash
for i in $(cat serverlist-file | while read f; do find . -name "*$f*"; done);
do
for j in $(cat serverlist-file);
do
sed -i -e "\$a\
this\ is\ a\ config\ file\nfor\ $j" $i
done
done
Expected Output:
/data/servers/customer01/server01.cfg
this is a config file
for server01
/data/servers/customer01/server02.cfg
this is a config file
for server02
/data/servers/customer02/server03.cfg
this is a config file
for server03
/data/servers/customer03/server04.cfg
this is a config file
for server04
/data/servers/customer03/server05.cfg
this is a config file
for server05
Edit, in reply to @ShawnMilo:
I am trying to bulk-add some config to some Nagios config files, but not to all server config files.
So, searching with find . -name '*.config' isn't going to work, because then all the config files would get edited.
I only want specific files to be edited, just the servers from the serverlist-file.
Nagios configs need to have the hostname of the server in them, like:
define service {
    use generic-service
    host_name server01
    service_description SSH
    contact_groups linux
    check_command check_something
}
Seems like an odd requirement. What are you actually trying to do?
In any case, this will do what was requested:
$ find . -name '*.config' | while read -r x; do echo "$x"; cat "$x"; echo; done
./data/servers/customer02/server03.config
default stuff here
./data/servers/customer03/server05.config
default stuff here
./data/servers/customer03/server04.config
default stuff here
./data/servers/customer01/server01.config
default stuff here
./data/servers/customer01/server02.config
default stuff here
$ find . -name '*.config' | while read -r x; do name=$(basename "$x"); echo -e "this is a config file\nfor ${name%%.*}" >> "$x"; done
$ find . -name '*.config' | while read -r x; do echo "$x"; cat "$x"; echo; done
./data/servers/customer02/server03.config
default stuff here
this is a config file
for server03
./data/servers/customer03/server05.config
default stuff here
this is a config file
for server05
./data/servers/customer03/server04.config
default stuff here
this is a config file
for server04
./data/servers/customer01/server01.config
default stuff here
this is a config file
for server01
./data/servers/customer01/server02.config
default stuff here
this is a config file
for server02
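Since the edit says only the servers listed in serverlist-file should be touched, here is a hedged sketch that reads that list and appends only to the matching files (paths and naming assumed from the question):
#!/bin/bash
# append the text only to config files whose names contain a server from serverlist-file
while read -r server; do
    find . -type f -name "*${server}*" | while read -r cfg; do
        printf 'this is a config file\nfor %s\n' "${server}" >> "${cfg}"
    done
done < serverlist-file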

Is there a way to escape angle brackets in a text block directed to a file?

Using a bash script as a working example:
#!/bin/bash
echo "\
<global_preferences>
...
</global_preferences>" >> global_prefs.xml
I tried the following, which (IMHO) should have worked but didn't:
(
^<config^>
^</config^>
) > test.xml
The following does work, but it is a PITA as the XML file is long:
echo ^<config^> > test.xml
echo ^</config^> >> test.xml
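For what it's worth, angle brackets need no escaping at all once they are quoted in bash; a quoted here-document keeps a long XML block readable. A minimal sketch:
#!/bin/bash
# the quoted delimiter ('EOF') makes the block literal, so <, > and $ pass through unchanged
cat >> global_prefs.xml <<'EOF'
<global_preferences>
...
</global_preferences>
EOF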

Shell script: Copy file and folder N times

I have two items:
a .json file
a folder with random content
where <transaction> is id plus a sequential number (id1, id2... idn).
I'd like to replicate this structure (.json + folder) up to n. I mean:
I'd like to have an id1.json and an id1 folder, an id2.json and an id2 folder... an idn.json and an idn folder.
Is there any way (a shell script) to populate this content?
It would be something like:
for (i=0,i<n,i++) {
copy "id" file to "id+i" file
copy "id" folder to "id+i" folder
}
Any ideas?
Your shell syntax is off, but after that this should be trivial.
#!/bin/bash
for (( i=0; i<$1; i++ )); do
    cp "id".json "id$i".json
    cp -r "id" "id$i"
done
This expects the value of n as the sole argument to the script (which is visible inside the script in $1).
The C-style for((...)) loop is Bash only, and will not work with sh.
A proper production script would also check that it received the expected parameter in the expected format (a single positive number) but you will probably want to tackle such complications when you learn more.
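A hedged sketch of the parameter check mentioned above (bash; it rejects anything that is not a single positive integer):
# validate that exactly one positive integer argument was supplied
if [[ $# -ne 1 || ! $1 =~ ^[1-9][0-9]*$ ]]; then
    echo "usage: $0 transaction-count" >&2
    exit 1
fi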
Additionally, here is a version that works with sh:
#!/bin/sh
test -e id.json || { echo "id.json not found" >&2 ; exit 1 ; }
{
    # seq fails when $1 is missing or not a number; print usage in that case
    seq 1 "$1" 2> /dev/null ||
        { echo "usage: $0 transaction-count" >&2 ; exit 1 ; }
} |
while read i
do
    cp "id".json "id$i".json
    cp -r "id" "id$i"
done

Linux for loop with two variables each time

I have several files in a directory and I want to run some Linux packages on these files two at a time, e.g. ERR1045141_1 with ERR1045141_2, ERR1045144_1 with ERR1045144_2, and so on. So I wrote a for loop for this, but it is not working.
files:
ERR1045141_1.fastq.gz
ERR1045141_2.fastq.gz
ERR1045144_1.fastq.gz
ERR1045144_2.fastq.gz
ERR1045145_1.fastq.gz
ERR1045145_2.fastq.gz
ERR1045146_1.fastq.gz
ERR1045146_2.fastq.gz
ERR1045148_1.fastq.gz
ERR1045148_2.fastq.gz
ERR1045149_1.fastq.gz
ERR1045149_2.fastq.gz
ERR1045151_1.fastq.gz
ERR1045151_2.fastq.gz
ERR1045152_1.fastq.gz
ERR1045152_2.fastq.gz
ERR1045154_1.fastq.gz
ERR1045154_2.fastq.gz
Code:
files=ls
for (( i=0; i<${#files[#]} ; i+=2 )) ; do
echo "${files[i]}" "${files[i+1]}"
done
It did not work and I am not sure whether files=ls is the problem. Is there any better way to do it? Please advise.
Try the following if you are sure about the existence of the second file:
for file1 in ERR*_1*
do
    file2=$(echo "$file1" | sed 's/_1/_2/g')
    echo "$file1" "$file2"
done
No, what you really want to do is to process all the _1 files, performing some action on each one and its associated _2 file.
You can do that with something as simple as the for loop in this complete test program:
#!/usr/bin/env bash
doSomethingWith() {
    echo "[$1] [$2]"
}
touch 'xERR1045141_1.fastq.gz' 'xERR1045141_2.fastq.gz'
touch 'xERR1045144_1.fastq.gz' 'xERR1045144_2.fastq.gz'
touch 'xERR1045145_1.fastq.gz' 'xERR1045145_2.fastq.gz'
touch 'xERR1045146_1.fastq.gz' 'xERR1045146_2.fastq.gz'
touch 'xERR1045148_1.fastq.gz' 'xERR1045148_2.fastq.gz'
touch 'xERR1045149_1.fastq.gz' 'xERR1045149_2.fastq.gz'
touch 'xERR1045151_1.fastq.gz' 'xERR1045151_2.fastq.gz'
touch 'xERR1045152_1.fastq.gz' 'xERR1045152_2.fastq.gz'
touch 'xERR1045154_1.fastq.gz' 'xERR1045154_2.fastq.gz'
touch 'xERR 45154_1.fastq.gz' 'xERR 45154_2.fastq.gz'
for file1 in xERR*_1.fastq.gz ; do
    file2="${file1/_1/_2}"
    doSomethingWith "${file1}" "${file2}"
done
rm -rf xERR*.fastq.gz
This program outputs:
[xERR1045141_1.fastq.gz] [xERR1045141_2.fastq.gz]
[xERR1045144_1.fastq.gz] [xERR1045144_2.fastq.gz]
[xERR1045145_1.fastq.gz] [xERR1045145_2.fastq.gz]
[xERR1045146_1.fastq.gz] [xERR1045146_2.fastq.gz]
[xERR1045148_1.fastq.gz] [xERR1045148_2.fastq.gz]
[xERR1045149_1.fastq.gz] [xERR1045149_2.fastq.gz]
[xERR1045151_1.fastq.gz] [xERR1045151_2.fastq.gz]
[xERR1045152_1.fastq.gz] [xERR1045152_2.fastq.gz]
[xERR1045154_1.fastq.gz] [xERR1045154_2.fastq.gz]
[xERR 45154_1.fastq.gz] [xERR 45154_2.fastq.gz]
to show that the names are being handled correctly.
Note that I've named the files xERR* so as not to clash with your own files. You should adjust the loop to handle your own files once you're satisfied it will work okay.
And, just as an aside, if you don't want to do anything except for those cases where both files exist, you can simply replace the "action" line with something like:
[[ -f "${file2}" ]] && doSomethingWith "${file1}" "${file2}"
This will bypass those cases where the _2 file is not a regular file.
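As a side note on the original attempt: files=ls assigns the literal string ls rather than capturing the command's output, and a glob-populated array avoids parsing ls entirely. A minimal sketch of the pairwise loop:
#!/bin/bash
# fill an array from the glob, then step through it two entries at a time
files=( ERR*.fastq.gz )
for (( i=0; i<${#files[@]}; i+=2 )); do
    echo "${files[i]}" "${files[i+1]}"
done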

Issue concatenating files in a shell script

Sorry, I'm from Brazil and my English is not fluent.
I want to concatenate 20 files in a shell script using the cat command. However, when I run it from a script file, all the file contents are shown on the screen.
When I run it directly from the terminal, it works perfectly.
This is my code:
#!/usr/bin/ksh
set -x -a
. /PROD/INCLUDE/include.prod
DATE=`date +'%Y%m%d%H%M%S'`
FINAL_NAME=$1
# check if all paremeters are passed
if [ -z "$FINAL_NAME" ]; then
    echo "Please pass the final name as parameter"
    exit 1
fi
# concatenate files
cat $DIRFILE/AI6LM760_AI6_CF2_SLOTP01* $DIRFILE/AI6LM761_AI6_CF2_SLOTP02* $DIRFILE/AI6LM763_AI6_CF2_SLOTP04* \
$DIRFILE/AI6LM764_AI6_CF2_SLOTP05* $DIRFILE/AI6LM765_AI6_CF2_SLOTP06* $DIRFILE/AI6LM766_AI6_CF2_SLOTP07* \
$DIRFILE/AI6LM767_AI6_CF2_SLOTP08* $DIRFILE/AI6LM768_AI6_CF2_SLOTP09* $DIRFILE/AI6LM769_AI6_CF2_SLOTP10* \
$DIRFILE/AI6LM770_AI6_CF2_SLOTP11* $DIRFILE/AI6LM771_AI6_CF2_SLOTP12* $DIRFILE/AI6LM772_AI6_CF2_SLOTP13* \
$DIRFILE/AI6LM773_AI6_CF2_SLOTP14* $DIRFILE/AI6LM774_AI6_CF2_SLOTP15* $DIRFILE/AI6LM775_AI6_CF2_SLOTP16* \
$DIRFILE/AI6LM776_AI6_CF2_SLOTP17* $DIRFILE/AI6LM777_AI6_CF2_SLOTP18* $DIRFILE/AI6LM778_AI6_CF2_SLOTP19* \
$DIRFILE/AI6LM779_AI6_CF2_SLOTP20* > CF2_FINAL_TEMP
mv $DIRFILE/CF2_FINAL_TEMP $DIRFILE/$FINAL_NAME
I solved the problem by putting the cat block inside a function and redirecting its stdout to the final file.
Ex:
concatenate()
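A hedged reconstruction of that fix, with the file list abbreviated (the remaining SLOTP patterns would be listed inside the function exactly as in the original cat command, and $DIRFILE is assumed to come from the sourced include file):
concatenate()
{
    # file list abbreviated to two patterns; the others follow the same form
    cat "$DIRFILE"/AI6LM760_AI6_CF2_SLOTP01* \
        "$DIRFILE"/AI6LM779_AI6_CF2_SLOTP20*
}

# redirect the function's stdout straight to the final file
concatenate > "$DIRFILE/$FINAL_NAME"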
