I have the following JSON:
{
"tipopersona": "M",
"regimen": "36",
"periodicidad": "Y",
"ejercicio": "2020",
"periodo": "035",
"periodoDesc": "Del Ejercicio",
"tipoDeclaracion": "001",
"tipoDeclaracionDesc": "Normal",
"tipoComplementaria": "",
"tipoComplementariaDesc": "",
"cmbISSIF": "1",
"obligaciones": [ "0101", "0192" ],
"preguntasPerfil": { "178": "1" },
"rfc": "AAC920317CM8",
"rechazopropuesta": false,
"fechaVencimiento": "2021-03-31T00:00:00",
"errores": true,
"xmldeclaracion": ""
}
I need to remove the spaces and newlines from the JSON, as in the following example, using JSR223 or Beanshell:
{"tipopersona":"M","regimen":"36","periodicidad":"Y","ejercicio":"2020","periodo":"035","periodoDesc":"DelEjercicio","tipoDeclaracion":"001","tipoDeclaracionDesc":"Normal","tipoComplementaria":"","tipoComplementariaDesc":"","cmbISSIF":"1","obligaciones":["0101","0192"],"preguntasPerfil":{"178":"1"},"rfc":"AAC920317CM8","rechazopropuesta":false,"fechaVencimiento":"2021-03-31T00:00:00","errores":true,"xmldeclaracion":""}
I'm using the following code:
String data = vars.get("json");
data = data.replaceAll(" ", "");
data = data.replaceAll(System.getProperty("line.separator"),"");
vars.put("data", data);
But in the Debug Sampler it still shows the spaces:
Since JMeter 3.1 you should be using JSR223 Test Elements and the Groovy language for scripting.
Groovy has built-in JSON support.
Assuming the above points, you can achieve your goal with the following one-liner:
vars.put('data', (new groovy.json.JsonBuilder(new groovy.json.JsonSlurper().parseText(vars.get('json'))).toString()))
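For reference, here is a minimal standalone sketch (plain Groovy, outside JMeter, with a hypothetical trimmed-down input) of what that one-liner does: parsing and re-serialising drops the insignificant whitespace but keeps the spaces inside string values such as "Del Ejercicio".
import groovy.json.JsonBuilder
import groovy.json.JsonSlurper

// Hypothetical trimmed-down input based on the JSON in the question
def json = '''
{
  "periodoDesc": "Del Ejercicio",
  "obligaciones": [ "0101", "0192" ],
  "preguntasPerfil": { "178": "1" }
}
'''
def compact = new JsonBuilder(new JsonSlurper().parseText(json)).toString()
println compact
// prints one compact line, e.g. {"periodoDesc":"Del Ejercicio","obligaciones":["0101","0192"],"preguntasPerfil":{"178":"1"}}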
More information on Groovy scripting in JMeter: Apache Groovy - Why and How You Should Use It
The easiest way is to use a JSON Extractor on the json variable as a child of the sampler. Your code might also replace the spaces inside JSON keys and values.
Related
In JMeter, I want to construct the request parameter value from a dataset file based on the PropertyCount.
Dataset
PropertyCount propertyid1 propertyid2 propertyid3
2 13029526 15763743
3 13029526 15763743 12345645
2 13029526 15763743
Request Input Parameter
"values":["13029526","15763743"]
"values":[${outputString}]
PreProcessor Script
With the preprocessor script below, I am getting the following output, but I am looking to get the values as in the Request Input Parameter, with quotes.
2021-08-29 22:15:04,706 INFO o.a.j.m.J.JSR223 PreProcessor: Required output: 13029526,15763743,
2021-08-29 22:15:04,785 INFO o.a.j.m.J.JSR223 PreProcessor: Required output: 13029526,15763743,
JSR223 PreProcessor
def requiredOutput = new StringBuilder()
1.upto(vars.get('propertycount') as int, {
    requiredOutput.append(vars.get('propertyid' + it))
    requiredOutput
    requiredOutput.append(',')
    vars.put("outputString", requiredOutput.toString());
})
You seem to be constructing a JSON array, therefore it makes more sense to consider using Groovy's JsonBuilder instead of doing manual string concatenation:
def outputString = []
1.upto(vars.get('PropertyCount') as int, {
outputString.add(vars.get("propertyid$it"))
})
vars.put('outputString', new groovy.json.JsonBuilder(outputString).toPrettyString())
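If it helps, here is a minimal standalone sketch (plain Groovy, outside JMeter) of what that JsonBuilder call produces, using sample values taken from the dataset above:
import groovy.json.JsonBuilder

// Sample values from the first row of the dataset
def outputString = ['13029526', '15763743']
println new JsonBuilder(outputString).toString()
// ["13029526","15763743"]
Note that JsonBuilder already emits the surrounding square brackets, so with this approach the request snippet could read "values":${outputString} instead of "values":[${outputString}].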
More information:
Apache Groovy - Parsing and producing JSON
Apache Groovy - Why and How You Should Use It
I'm trying to dynamically replace my zip file pattern in rtDownload in my declarative Jenkinsfile, but the value is not picked up by the function.
A zip file with a newer version is created and uploaded on JFrog with every build and I want to download a particular version in my local system.
I have defined "VERSION" in def VERSION and used it as -
rtDownload (
    serverId: 'Jfrog',
    spec: '''{
        "files": [
            {
                "pattern": "<path>-<filename>${VERSION}.zip",
                "target": "<path>-<filename>${VERSION}.zip",
                "flat": "true"
            }
        ]
    }'''
)
But it does not replace VERSION with the version I am providing through the String parameter configured in the Jenkins job.
Any suggestions would be greatly appreciated. Thanks.
In Groovy, single-quoted strings ('') and single-quoted multi-line strings (''' ''') do not support string interpolation; only double-quoted strings support it.
You can read more about string interpolation in Groovy in the official documentation.
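A minimal Groovy sketch (runnable outside Jenkins, with a hypothetical version value) illustrating the difference:
def VERSION = '1.0.3'  // hypothetical version value

println 'pattern-${VERSION}.zip'      // single-quoted: prints pattern-${VERSION}.zip literally
println "pattern-${VERSION}.zip"      // double-quoted: prints pattern-1.0.3.zip
println '''pattern-${VERSION}.zip'''  // triple single quotes: still no interpolation
println """pattern-${VERSION}.zip"""  // triple double quotes: prints pattern-1.0.3.zip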
So to fix it, just use a double-quoted multi-line string, which will enable the evaluation of your parameters:
rtDownload (
    serverId: 'Jfrog',
    spec: """{
        "files": [
            {
                "pattern": "<path>-<filename>${VERSION}.zip",
                "target": "<path>-<filename>${VERSION}.zip",
                "flat": "true"
            }
        ]
    }"""
)
I have a YAML file (also used in an Azure DevOps pipeline, so it needs to stay in this format) which contains some settings I'd like to access directly from my Terraform module.
The file looks something like:
variables:
- name: tenantsList
value: tenanta,tenantb
- name: unitName
value: canary
I'd like to have a module like this that accesses the settings, but I can't see how to get to the bottom level:
locals {
settings = yamldecode(file("../settings.yml"))
}
module "infra" {
source = "../../../infra/terraform/"
unitname = local.settings.variables.unitName
}
But the terraform plan errors with this:
Error: Unsupported attribute
on canary.tf line 16, in module "infra":
16: unitname = local.settings.variables.unitName
|----------------
| local.settings.variables is tuple with 2 elements
This value does not have any attributes.
It seems like the main reason this is difficult is because this YAML file is representing what is logically a single map but is physically represented as a YAML list of maps.
When reading data from a separate file like this, I like to write an explicit expression to normalize it and optionally transform it for more convenient use in the rest of the Terraform module. In this case, it seems like having variables as a map would be the most useful representation as a Terraform value, so we can write a transformation expression like this:
locals {
raw_settings = yamldecode(file("${path.module}/../settings.yml"))
settings = {
variables = tomap({
for v in local.raw_settings.variables : v.name => v.value
})
}
}
The above uses a for expression to project the list of maps into a single map using the name values as the keys.
With the list of maps converted to a single map, you can then access it the way you originally tried:
module "infra" {
source = "../../../infra/terraform/"
unitname = local.settings.variables.unitName
}
If you were to output the transformed value of local.settings as YAML, it would look something like this, which is why accessing the map elements directly is now possible:
variables:
tenantsList: tenanta,tenantb
unitName: canary
This will work only if all of the name strings in your input are unique, because otherwise there would not be a unique map key for each element.
(Writing a normalization expression like this also doubles as some implicit validation for the shape of that YAML file: if variables were not a list or if the values were not all of the same type then Terraform would raise a type error evaluating that expression. Even if no transformation is required, I like to write out this sort of expression anyway because it serves as some documentation for what shape the YAML file is expected to have, rather than having to study all of the references to it throughout the rest of the configuration.)
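As a quick sanity check (a hypothetical terraform console session, assuming the locals above), the projected value should look roughly like this:
> local.settings.variables
tomap({
  "tenantsList" = "tenanta,tenantb"
  "unitName" = "canary"
})
> local.settings.variables.unitName
"canary"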
With my multidecoder for YAML and JSON you are able to access multiple YAML and/or JSON files with their relative paths in one step.
Documentations can be found here:
Terraform Registry -
https://registry.terraform.io/modules/levmel/yaml_json/multidecoder/latest?tab=inputs
GitHub:
https://github.com/levmel/terraform-multidecoder-yaml_json
Usage
Place this module in the location where you need to access multiple different YAML and/or JSON files (different paths are possible) and pass your path(s) in the filepaths parameter, which takes a set of strings of relative paths to YAML and/or JSON files as an argument. You can change the module name if you want.
module "yaml_json_decoder" {
source = "levmel/yaml_json/multidecoder"
version = "0.2.1"
filepaths = ["routes/nsg_rules.yml", "failover/cosmosdb.json", "network/private_endpoints/*.yaml", "network/private_links/config_file.yml", "network/private_endpoints/*.yml", "pipeline/config/*.json"]
}
Patterns to access YAML and/or JSON files from relative paths:
To access all YAML and/or JSON files in a folder, enter your path as follows: "folder/rest_of_folders/*.yaml", "folder/rest_of_folders/*.yml" or "folder/rest_of_folders/*.json".
To access a specific YAML or JSON file in a folder structure, use "folder/rest_of_folders/name_of_yaml.yaml", "folder/rest_of_folders/name_of_yaml.yml" or "folder/rest_of_folders/name_of_yaml.json".
If you would like to select all YAML and/or JSON files within a folder, use the "*.yml", "*.yaml", "*.json" notation (see the Usage section above).
YAML delimiter support is available from version 0.1.0!
WARNING: Only the relative path must be specified. Do not pass path.root (it is already included in the module by default), only everything after it.
Access YAML and JSON entries
Now you can access all entries within all the YAML and/or JSON files you've selected like this: "module.yaml_json_decoder.files.[name of your YAML or JSON file].entry". If the name of your YAML or JSON file is "name_of_your_config_file", then access it as follows: "module.yaml_json_decoder.files.name_of_your_config_file.entry".
Example of multi YAML and JSON file accesses from different paths (directories)
first YAML file:
routes/nsg_rules.yml
rdp:
name: rdp
priority: 80
direction: Inbound
access: Allow
protocol: Tcp
source_port_range: "*"
destination_port_range: 3399
source_address_prefix: VirtualNetwork
destination_address_prefix: "*"
---
ssh:
name: ssh
priority: 70
direction: Inbound
access: Allow
protocol: Tcp
source_port_range: "*"
destination_port_range: 24
source_address_prefix: VirtualNetwork
destination_address_prefix: "*"
second YAML file:
services/logging/monitoring.yml
application_insights:
application_type: other
retention_in_days: 30
daily_data_cap_in_gb: 20
daily_data_cap_notifications_disabled: true
logs:
# Optional fields
- "AppMetrics"
- "AppAvailabilityResults"
- "AppEvents"
- "AppDependencies"
- "AppBrowserTimings"
- "AppExceptions"
- "AppExceptions"
- "AppPerformanceCounters"
- "AppRequests"
- "AppSystemEvents"
- "AppTraces"
first JSON file:
test/config/json_history.json
{
"glossary": {
"title": "example glossary",
"GlossDiv": {
"title": "S",
"GlossList": {
"GlossEntry": {
"ID": "SGML",
"SortAs": "SGML",
"GlossTerm": "Standard Generalized Markup Language",
"Acronym": "SGML",
"Abbrev": "ISO 8879:1986",
"GlossDef": {
"para": "A meta-markup language, used to create markup languages such as DocBook.",
"GlossSeeAlso": ["GML", "XML"]
},
"GlossSee": "markup"
}
}
}
}
}
main.tf
module "yaml_json_multidecoder" {
source = "levmel/yaml_json/multidecoder"
version = "0.2.1"
filepaths = ["routes/nsg_rules.yml", "services/logging/monitoring.yml", "test/config/*.json"]
}
output "nsg_rules_entry" {
value = module.yaml_json_multidecoder.files.nsg_rules.aks.ssh.source_address_prefix
}
output "application_insights_entry" {
value = module.yaml_json_multidecoder.files.monitoring.application_insights.daily_data_cap_in_gb
}
output "json_history" {
value = module.yaml_json_multidecoder.files.json_history.glossary.title
}
Changes to Outputs:
nsg_rules_entry = "VirtualNetwork"
application_insights_entry = 20
json_history = "example glossary"
I am using JMeter for testing the performance of a mobile application which uses IBM Worklight. I am getting 3 dynamic values which come as a string, and I need to handle these values. I tried a Regular Expression Extractor but it didn't work. Can anyone help me find a solution? The dynamic values are:
["{\"jsessionid\":\"0000Mhn7GqWMU1P7Xi9dpJ7mgFb\",\"mbparam\":\"ZjurDsggbg9CZBgd5miAIHMIH%2B5oC7XdSukctItof7AJnpe8UNhlBsgM%2F8w%3D\",\"MP-AUTH-TOKEN\":\"leXozMVUXFYixuYwxgV58EXuRg1Vd0xtpZeouAMQtk6Pd0I1D618motg\"}"]
Updated
I tried the regular expression you provided but it's also not working.
These are the steps I have performed. Please guide me if I have done anything wrong.
Updated
This is the response I am getting:
{
  "customerName":"abc",
  "homeEmail":"",
  "profileDebitAcc":"01234567",
  "sessKey":"0000V3EgdxpY937GTWQ3yogRLGq",
  "mbParam":"hDurAxWHjPT%2BtB7xEyz7Huu51oDOAH8gyNSWIBnHmA9UWuF0lcHGiOy82S0%3D",
  "responseHeaders":
  {
    "Content-Language":"en-AU",
    "Date":"Thu, 12 Nov 2015 05:59:50 GMT",
    "Content-Length":"6759",
    "Expires":"0",
    "Content-Type":"text/html; charset=ISO-8859-1",
    "X-Powered-By":"Servlet/3.0",
    "Cache-Control":"no-cache",
    "Pragma":"no-cache"
  },
  "AuthToken":"AHWXZlUt6Rupm1FeBWGu2TEVHZemZwVGbmwmpVxXJR7TMhCA8pWN96ae",
  "statusCode":200,
I need to extract the sessKey, mbParam and AuthToken values and send them as a list in the next request body.
In the request, these values appear as:
["{\"jsessionid\":\"0000gPQCV4FJ1NQvB8d4Ifd_P9I\",\"mbparam\":\"hDu7DhU%2FjA81TEjwbREmytgqIItmUS4b6rhEojYtcalv0PUs6iaewmtZu6U%3D\",\"MP-AUTH-TOKEN\":\"4fU7Bg20sRRUikHnzmZKcC4ZPyCjVxJnmm7QMnSm6mfT7GlqnySQS2YP\"}"]
How to handle these values?
Use the following Regular Expression Extractor configuration:
Reference Name: anything meaningful, e.g. dynamicvalues
Regular Expression:
\["\{\\"jsessionid\\":\\"(.+?)\\",\\"mbparam\\":\\"(.+?)\\",\\"MP-AUTH-TOKEN\\":\\"(.+?)\\"\}"\]
Template: $1$$2$$3$
Refer to the extracted values as:
${dynamicvalues_g1} - for jsessionid
${dynamicvalues_g2} - for mbparam
${dynamicvalues_g3} - for MP-AUTH-TOKEN
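For example, the next request body from the question could then be parameterised with the extracted groups like this (a sketch based on the request shown above):
["{\"jsessionid\":\"${dynamicvalues_g1}\",\"mbparam\":\"${dynamicvalues_g2}\",\"MP-AUTH-TOKEN\":\"${dynamicvalues_g3}\"}"]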
While developing your regular expression remember that you need to escape the following characters with a backslash:
[
{
\
}
]
Other special characters which need escaping are: .^$*+?()|
References:
Regular Expressions page of JMeter's User Manual
PCRE Regex Cheatsheet
Using RegEx (Regular Expression Extractor) with JMeter
jMeter - Regular Expressions
The following JSON is a transaction that will be sent to the Ripple Network to query accounts that hold cryptographic assets at a Gateway (somewhat like a bank, more like a trust account between its clients). This script is used in conjunction with PHP to fetch a Gateway's issued balances while ignoring its hot wallet, or day-to-day operations wallet. My questions are:
a. What is the proper way to assign JSON to a Ruby variable?
b. What is the best way to escape double quotes and deal with newlines where braces and square brackets occur within the JSON syntax?
The Ruby code that builds the JSON follows:
ripple_path="/home/rippled/build/rippled"
conf = "--conf /etc/rippled/rippled.cfg"
puts "About to set the JSON lines "
gatewayStart = "\"method\": \"gateway_balances\","
paramsLine = "\"params\": [ {"
accountLine = "\"account\": \"rGgS5Hw3PhSp3VNT43PDTXze9YfdthHUH\","
hotwalletLine = "\"hotwallet\": \"rKYNhsT3aLymkGH7WL7ZUHkm6RE27iuM4C\","
liLine = "\"ledger_index\": \"validated\","
strictLine = "\"strict\": "
trueLine = true
endLine = " } ] }"
balancesLine = "#{gatewayStart} #{paramsLine} #{accountLine} #{hotwalletLine} #{liLine} #{strictLine} #{trueLine} #{endLine}"
lineString = "#{balancesLine.to_s}"
linetoJSON = "#{lineString}"
puts "linetoJSON: #{linetoJSON} "
cmd2=`#{ripple_path} #{conf} json gateway_balances #{linetoJSON}`
cmder="#{ripple_path} #{conf} json gateway_balances #{linetoJSON}"
puts "Done."
The output is:
root#xagate:WorkingDirectory# ruby gatewaybal.rb
About to set the JSON lines
linetoJSON: "method": "gateway_balances", "params": [ { "account":
"rGgS5Hw3PhSp3VNT43PDTXze9YfdthHUH", "hotwallet": "rKYNhsT3aLymkGH7WL7ZUHkm6RE27iuM4C", "ledger_index": "validated", "strict":rue } ] }
Loading: "/etc/rippled/rippled.cfg"
rippled [options] <command> <params>
General Options:
-h [ --help ] Display this message.
.....
Done.
It is noteworthy that this command also returns a badSyntax error when executed manually via the command line. Please see here for the mirror of this issue raised on the ripple forums.
jsonLine = "'{ \"account\": \"rGgS5Hw3PhSp3VNT43PDTXze9YfdthHUH\", \"hotwallet\": \"rKYNhsT3aLymkGH7WL7ZUHkm6RE27iuM4C\", \"ledger_index\": \"validated\", \"strict\": true }'"
This is the proper way to assign the JSON to a single variable; the solution was provided by JoelKatz. The completed code is now available on GitHub.
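As an alternative sketch (not part of the original answer), Ruby's built-in json library can produce the same escaped payload without hand-writing the quotes; the paths and addresses below are the ones from the question:
require 'json'

ripple_path = "/home/rippled/build/rippled"
conf        = "--conf /etc/rippled/rippled.cfg"

params = {
  "account"      => "rGgS5Hw3PhSp3VNT43PDTXze9YfdthHUH",
  "hotwallet"    => "rKYNhsT3aLymkGH7WL7ZUHkm6RE27iuM4C",
  "ledger_index" => "validated",
  "strict"       => true
}

# JSON.generate handles the quoting; the outer single quotes protect the JSON from the shell
json_line = "'#{JSON.generate(params)}'"
cmd = "#{ripple_path} #{conf} json gateway_balances #{json_line}"
puts cmd
# output = `#{cmd}`   # uncomment to actually run rippled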