Merging JMeter result files

I'm trying to use the JMeter MergeResults plugin (http://jmeter-plugins.org/wiki/MergeResults/) to merge result files, but it seems the plugin is limited to 4 input files.
Is there any way I can merge more files (>100)?
The structure of the files seems simple enough (https://wiki.apache.org/jmeter/JtlFiles), so I am about to break out bash and write my own, but I wanted to know if something hasn't already been written.

Create a JMeter result file from your test plan:
jmeter -n -t ./project.jmx -l testresult.jtl
This will load the file project.jmx, run the test and save the result to testresult.jtl.
Once you have many of these result files, you can merge them using the following script:
#!/bin/bash
echo "Combining all results from files called testresult*.jtl into one file called merged.jtl"
echo "If merged.jtl exists, it will be overwritten"
cat testresult*.jtl > merged.jtl
# Strip the per-file XML declarations and <testResults> wrappers
sed -e 's_</testResults>__g' \
    -e 's_<?xml version="1.0" encoding="UTF-8"?>__g' \
    -e 's_<testResults version="1.2">__g' merged.jtl > /tmp/sedmerged
# Add a single declaration and wrapper around the whole result set
echo "</testResults>" >> /tmp/sedmerged
sed '1i <?xml version="1.0" encoding="UTF-8"?><testResults version="1.2">' /tmp/sedmerged > merged.jtl
The script will create a file called merged.jtl.
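To sanity-check that the merged file is still well-formed XML (assuming xmllint from libxml2 is available on your system), you can run:
xmllint --noout merged.jtl && echo "merged.jtl is well-formed"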
Done.

Related

How do I append the contents of numerous files to a single file?

I have 44 RTF files (file1.rtf, file2.rtf, ..., file44.rtf) and I need to combine them all into a single file (either file1.rtf or a new file altogether).
I understand that the way to combine the contents of two files is like this:
cat file2.rtf >> file1.rtf
This example appends the contents of file2.rtf into file1.rtf.
I also understand that I need to iterate through the files, which I can achieve like this:
for file in *.rtf;
do
# do something;
done
As such, I have this which appears to do the job:
#!/bin/bash
for file in *.rtf; do
    cat "$file" >> "../combined.rtf"
    echo "File $file added."
done
But there is an issue: when I run cat ../combined.rtf I see the combined documents, but when I run open ../combined.rtf it only shows me the contents of file1.rtf (in LibreOffice Writer).
Where have I gone wrong?
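Plain concatenation does not work for RTF: each file is a complete {\rtf1 ... } group, and a reader stops at the closing brace of the first group, which is why only file1.rtf renders. Since open suggests macOS, one option is textutil, which understands the format. A minimal sketch (note that the shell glob sorts lexicographically, so file10.rtf comes before file2.rtf; zero-padded names or an explicit file list may be needed):
textutil -cat rtf -o ../combined.rtf file*.rtf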

Manage Unix rights in bash by reading dynamic fields from a CSV file

I'm currently stuck on a bash script that should manage permissions on files and directories at the end of its processing.
I have 4 components:
the main script, which sources a .conf file and libs (.sh), processes things, and at the end calls a function "ApplyPermissionFromCSVFile" with a .csv file as argument, to ensure that rights are correctly set. This function should handle the job of managing permissions on files
a script called "permission_lib.sh", which contains several functions, including "ApplyPermissionFromCSVFile". This script is SOURCED at the beginning of the main script
a .conf file which contains some paths defined as variables, and which is SOURCED at the beginning of the main script
a .csv file containing paths (including "dynamic paths", which refer to variables defined in the conf file) for files and directories, which is READ by the ApplyPermissionFromCSVFile function
At the moment, the main script runs correctly and is able to source both the conf file and the lib file, but when I put some debug points inside "ApplyPermissionFromCSVFile", it appears that the "dynamic paths" are not interpreted by bash.
Extract of Main Script:
##################################################
# includes
##################################################
# this section can _almost_ be copied as-is ;-)
nameOfThisScript=$(basename "${BASH_SOURCE[0]}")
directoryOfThisScript="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
configFile="$directoryOfThisScript/$nameOfThisScript.conf"
functionsFile="$directoryOfThisScript/safeScriptsFunctions.sh"
permissionLib="$directoryOfThisScript/permission_lib.sh"
permissionFile="$directoryOfThisScript/$nameOfThisScript.permissionFile.csv"
for fileToSource in "$configFile" "$functionsFile" "$permissionLib"; do
source "$fileToSource" 2>/dev/null || {
echo "File '$fileToSource' not found"
exit 1
}
done
#Main Loop to read CSV File is called in permissionLib.sh
ApplyPermissionFromCSVFile "$permissionFile"
Extract of the conf file (real filenames replaced for the example):
totovariable="/usr/local"
tatavariable="$totovariable/bin"
Extract of csv file:
$totovariable;someuser;somegroup;0600
$tatavariable;someuser;somegroup;0600
Extract of permission lib file:
function ApplyPermissionFromCSVFile {
local csvFileName="$1"
local fieldNumberFileName=1
local fieldNumberOwner=2
local fieldNumberGroupOwner=3
local fieldNumberPermissions=4
while read -r csvLine
do
fileName=$(getFieldFromCsvLine "$fieldNumberFileName" "$csvLine")
fileOwner=$(getFieldFromCsvLine "$fieldNumberOwner" "$csvLine")
fileGroupOwner=$(getFieldFromCsvLine "$fieldNumberGroupOwner" "$csvLine")
filePermissions=$(getFieldFromCsvLine "$fieldNumberPermissions" "$csvLine")
permissionArray[0,0]="$fileName|$fileOwner|$fileGroupOwner|$filePermissions"
echo "${permissionArray[0,0]}"
done < "$csvFileName"
}
getFieldFromCsvLine() {
csvFieldSeparator=';'
fieldNumber="$1"
csvLine="$2"
echo "$csvLine" | cut -d "$csvFieldSeparator" -f "$fieldNumber"
}
Don't worry about the fact that the loop overwrites the value at each iteration; that's not the point here (but feel free to address it as an optional answer :p).
Which outputs the following:
$totovariable|someuser|somegroup|0600
$tatavariable|someuser|somegroup|0600
Changing owner to someuser:somegroup for file $tatavariable
chown: cannot access '$tatavariable': No such file or directory
Changing permissions to 0600 for file $tatavariable
chmod: cannot access '$tatavariable': No such file or directory
After some investigation and research, this seems normal, as:
the conf file is SOURCED (by the main script)
the lib file is SOURCED (by the main script)
the csv file is not SOURCED but READ (by the function in the lib), so bash treats the variable references in it as plain strings, not as variables
The issue is that I can't clearly see how and where I should replace the plain-string variable references with their values (defined in the .conf file and sourced by the main script): at the main script level? At the lib level with global variables?
Possible solutions I've found so far:
a sed substitution
using eval
Any help would be appreciated.
Solution used:
fileName=$(eval echo "$fileName")
As the paths in the .csv file may contain a "$" symbol, bash parameter expansion will not work in every case.
Example:
with the following csv content:
$tatavariable;someuser;somegroup;0600
$totovariable/thisotherfile.txt;someuser;somegroup;0660
$totovariable;someuser;somegroup;0600
/home/someuser/lolzy.txt;someuser;somegroup;0666
And the following conf file:
totovariable="/home/someuser/fack"
tatavariable="$totovariable/thisfile.txt"
The following bash code (based on eval, which is generally not recommended) will work in every case ($ symbol or not):
#!/bin/bash
#########################
# Function
#########################
getFieldFromCsvLine() {
local csvFieldSeparator=';'
local fieldNumber="$1"
local csvLine="$2"
echo "$csvLine" | cut -d "$csvFieldSeparator" -f "$fieldNumber"
}
#########################
#Core Script
#########################
source configFile.conf
csvFileName="permissionFile.csv"
fieldNumberFileName=1
fieldNumberOwner=2
fieldNumberGroupOwner=3
fieldNumberPermissions=4
while read -r csvLine
do
fileName=$(getFieldFromCsvLine "$fieldNumberFileName" "$csvLine")
fileOwner=$(getFieldFromCsvLine "$fieldNumberOwner" "$csvLine")
fileGroupOwner=$(getFieldFromCsvLine "$fieldNumberGroupOwner" "$csvLine")
filePermissions=$(getFieldFromCsvLine "$fieldNumberPermissions" "$csvLine")
#Managing Variables used as 'Dynamic path'
fileName=$(eval echo "$fileName")
echo "Content of \$fileName is $fileName"
done < "$csvFileName"
Results:
[someuser#SAFEsandbox:~]$ ./simpletest.sh
Content of $fileName is /home/someuser/fack/thisfile.txt
Content of $fileName is /home/someuser/fack/thisotherfile.txt
Content of $fileName is /home/someuser/fack
Content of $fileName is /home/someuser/lolzy.txt
Whereas the following bash code (based on bash parameter expansion) will throw errors:
#!/bin/bash
#########################
# Function
#########################
getFieldFromCsvLine() {
local csvFieldSeparator=';'
local fieldNumber="$1"
local csvLine="$2"
echo "$csvLine" | cut -d "$csvFieldSeparator" -f "$fieldNumber"
}
#########################
#Core Script
#########################
source configFile.conf
csvFileName="permissionFile.csv"
fieldNumberFileName=1
fieldNumberOwner=2
fieldNumberGroupOwner=3
fieldNumberPermissions=4
while read -r csvLine
do
fileName=$(getFieldFromCsvLine "$fieldNumberFileName" "$csvLine")
fileOwner=$(getFieldFromCsvLine "$fieldNumberOwner" "$csvLine")
fileGroupOwner=$(getFieldFromCsvLine "$fieldNumberGroupOwner" "$csvLine")
filePermissions=$(getFieldFromCsvLine "$fieldNumberPermissions" "$csvLine")
#Managing Variables used as 'Dynamic path'
fileName=${!fileName}
echo "Content of \$fileName is $fileName"
done < "$csvFileName"
Example behaviour when the .csv contains a $ symbol:
[someuser#SAFEsandbox:~]$ ./simpletest.sh
./simpletest.sh: line 35: $tatavariable: bad substitution
Example behaviour when you remove the $ symbol from the .csv file, but it still contains concatenated path components:
[someuser#SAFEsandbox:~]$ ./simpletest.sh
Content of $fileName is /home/someuser/fack/thisfile.txt
./simpletest.sh: line 35: totovariable/thisotherfile.txt: bad substitution
Using bash parameter expansion (which is recommended in most cases) is not easy here, because it forces the script to manage 2 cases:
paths containing a leading $ or several $ (concatenated variable paths)
paths not containing any $ at all
An eval-free alternative using envsubst is sketched below.
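For completeness, a minimal eval-free sketch using envsubst from GNU gettext. This assumes envsubst is installed and that the .conf variables are exported so envsubst can see them; file names match the examples above. It also reads the 4 fields directly with IFS instead of calling cut per field:
#!/bin/bash
set -a                     # auto-export everything defined while sourcing
source configFile.conf
set +a
while IFS=';' read -r fileName fileOwner fileGroupOwner filePermissions; do
    # envsubst replaces $var references using exported environment variables
    fileName=$(printf '%s\n' "$fileName" | envsubst)
    echo "Content of \$fileName is $fileName"
done < permissionFile.csv
Unlike eval, envsubst only performs variable substitution, so a malicious field like $(rm -rf /) in the CSV is never executed.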
Regards

Listing of files in directory in shell script

I have a folder with a list of xml files like data0.xml, data1.xml, data2.xml, ... data99.xml.
I have to read the contents of these files for further processing. Currently I am using a for loop like the one below:
for xmlentry in `ls -v *.xml`
do
    execute_loop "$xmlentry"
done
This executes fine for all xml files in sequence, but I want to force the for loop to start from data10.xml and proceed till data99.xml:
the loop should start at data10.xml, then data11.xml, ..., up to data99.xml.
How can I do something like this in shell scripting? Even better if I could control the start of the loop with a variable.
You can construct the names of the files and loop through them. In your specific example, something like this could work:
first=10
last=99
for i in $(seq "$first" "$last")
do
xmlfile="data${i}.xml"
execute_loop "$xmlfile"
done
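If you prefer not to spawn seq, bash's built-in C-style for loop does the same thing with plain variables:
first=10
last=99
for (( i = first; i <= last; i++ )); do
    xmlfile="data${i}.xml"
    execute_loop "$xmlfile"
done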

for loop calling two different files (same name, different extension) across several files

I have several files. I need to run a perl script over all the files in a folder, calling it for each file with extension .pep together with the similar file that has the same name but the extension .pep.nuc (like Oh01.pep and Oh01.pep.nuc).
I have this script so far, but I am missing something, of course.
for file *.pep; do ./script.pl *.pep *.pep.nuc > "${file%.*}.nucleo"; done
This should do it:
for file in *.pep; do
./script.pl "$file" "${file}.nuc" > "${file%.pep}.nucleo";
done
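If some .pep files might not have a matching .pep.nuc, a small guard keeps the loop from passing a non-existent file to script.pl (an optional variation on the loop above):
for file in *.pep; do
    [ -e "${file}.nuc" ] || continue
    ./script.pl "$file" "${file}.nuc" > "${file%.pep}.nucleo"
done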

Bash script to obtain the newest file X in a folder and create a new variable called X+1

I am trying to create a loop in a Bash script for a series of data migrations:
At the beginning of every step, the script should get the name of the newest file in a folder called "migrationfiles/", store it in the variable "migbefore", and create a new variable "migafter" whose trailing number is "migbefore"'s plus 1:
Example: if the "migrationfiles/" folder contains the following files:
migration.pickle1 migration.pickle2 migration.pickle3
The variable "migbefore" and migafter should have the following value:
migbefore=migration.pickle3
migafter=migration.pickle4
At the end of every step, the function "metl", which is in charge of the data migration, uses the "migbefore" file to load the data and creates one new file named after "migafter", storing it in the "migrationfiles/" folder; so in this case, the new file created will be called:
"migration.pickle4"
The code I intend to use is the following:
#!/bin/bash
migbefore=0
migafter=0
for y in testappend/*
for x in migrationfiles/*
do
migbefore=migration.pickle(oldest)
migafter=migbefore+1
done
do
metl -m migrationfiles/"${migbefore}"
-t migrationfiles/"${migafter}"
-s "${y}"
config3.yml
done
Does anyone know how I could write the first loop (the one that searches for the newest file in the "migrationfiles/" folder) and then set the variable "migafter" to "migbefore" plus 1?
I think this might do what you want.
#!/bin/bash
count=0
prefix=migration.pickle
migbefore=$prefix$((count++))
migafter=$prefix$((count++))
for y in testappend/*; do
echo metl -m migrationfiles/"${migbefore}" \
-t migrationfiles/"${migafter}" \
-s "${y}" \
config3.yml
migbefore=$migafter
migafter=$prefix$((count++))
done
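Since metl is prefixed with echo, the loop only prints the commands it would run. Assuming testappend/ contained, say, two hypothetical files a.csv and b.csv, the output would be:
metl -m migrationfiles/migration.pickle0 -t migrationfiles/migration.pickle1 -s testappend/a.csv config3.yml
metl -m migrationfiles/migration.pickle1 -t migrationfiles/migration.pickle2 -s testappend/b.csv config3.yml
Remove the echo once the sequence looks right.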
Copy with Numbered Backups
It's really hard to tell what you're really trying to do here, and why. However, you might be able to make life simpler by using the --backup flag from the cp command. For example:
cp --backup=numbered testappend/migration.pickle migrationfiles/
This will ensure that you have a sequence of migration files like:
migration.pickle
migration.pickle.~1~
migration.pickle.~2~
migration.pickle.~3~
where the older versions have smaller ordinal numbers (the highest number is the most recent backup), while the latest version has no ordinal extension. It's a pretty simple system, but it works well for a wide variety of use cases. YMMV.
# configuration:
path=migrationfiles
prefix=migration.pickle
# determine number of last file:
last_number=$( find "${path}" -name "${prefix}*" | sed -e "s/.*${prefix}//g" | sort -n | tail -1 )
# put together the file names:
migbefore=${prefix}${last_number}
migafter=${prefix}$(( last_number + 1 ))
# test it:
echo $migbefore $migafter
This should work even if there are no migration files yet. In that case, the value of migbefore is just the prefix and does not point to a real file.
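A quick way to test the snippet (hypothetical files matching the question's example):
mkdir -p migrationfiles
touch migrationfiles/migration.pickle{1..3}
# the snippet above then prints:
# migration.pickle3 migration.pickle4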
