How can I read data from a text file in OpenTest?

I tried the code below, but it throws this error:
Caused by: :1 ReferenceError: "require" is not defined
fs.readFile('test.txt', 'utf-8', function (err, data) {
    if (err) {
        return $log(err);
    }
    $log(data);
})

You are trying to use the Node.js API, which is not available in OpenTest. The proper way to read a text file is by using the ReadTextFile keyword:
- description: Read text file from disk
  action: org.getopentest.actions.files.ReadTextFile
  args:
    file: $tempDir + "/test.txt"
    encoding: UTF-8
- script: |
    var fileContents = $output.text;
    $log(fileContents)

Related

Getting error accessing JMeter variables (variable_matchNr) in YAML in Beanshell script

I am trying to bulk edit one entity called issues. I call the GetIssues API and, via a JSON extractor, get all issue IDs into the variable "issueIds".
JSON extractor to extract issueIds
Now I want to pass these IDs to the other API, bulk edit issues. If I directly use this array in the next API, I get the error below:
{"details":"Unexpected token $ in JSON at position 19","metadata":{}}
So I used the code below in a Beanshell post-processor:
var mylist;
props.put("mylist", new ArrayList());
log.info("Detected " + vars.get("issueIds_matchNr") + " issueIds");
for (int i = 1; i <= Integer.parseInt(vars.get("issueIds_matchNr")); i++) {
    log.info("IDs # " + i + ": " + vars.get("issueIds_" + i));
    props.get("mylist").add('"' + vars.get("issueIds_" + i) + '"');
}
log.info(props.get("mylist").toString());
var issueIdList;
vars.put("issueIdList", props.get("mylist").toString());
log.info(vars.get("issueIdList"));
If I pass the issueIdList variable in my next API call, this works fine in JMeter.
Sample variable values in the Debug Sampler look like:
issueIdList=["555bcfc2", "33974d2c", "e58db1d6"]
issueIds_1=555bcfc2
issueIds_2=33974d2c
issueIds_3=e58db1d6
issueIds_matchNr=3
The problem I am facing: when I convert my JMX with jmx2yaml and try to run the resulting file with
bzt issues.yml
then, while executing the above Beanshell script, the issueIds_matchNr and issueIds_3 variables are not detected and I get the error below:
2022-05-29 08:26:10,785 INFO o.a.j.e.J.JSR223PostProcessor: Detected null issueIds
2022-05-29 08:26:10,795 ERROR o.a.j.e.JSR223PostProcessor: Problem in JSR223 script, JSR223PostProcessor
javax.script.ScriptException: Sourced file: eval stream : Method Invocation Integer.parseInt : at Line: 4 : in file: eval stream : Integer .parseInt ( vars .get ( "issueIds_matchNr" ) )
Target exception: java.lang.NumberFormatException: null
in eval stream at line number 4
at bsh.engine.BshScriptEngine.evalSource(BshScriptEngine.java:87) ~[bsh-2.0b6.jar:2.0b6 2016-02-05 05:16:19]
My YAML script is:
- extract-jsonpath:
    issueIds:
      default: NotFound
      jsonpath: $..id
  follow-redirects: true
  jsr223:
  - compile-cache: true
    execute: after
    language: beanshell
    parameters: issueIds
    script-file: script.bsh
  label: Get Issue Id's
  method: GET
  url: https://${BASE_URL1}/${PROJECT_ID}/issues?limit=5&sortBy=-displayId&filter%5Bvalid%5D=true
You're missing one important bit: setting the match-no in your Taurus YAML.
The correct definition of the JSON Extractor would be something like:
extract-jsonpath:
  issueIds:
    jsonpath: $..id
    match-no: -1  # this is what you need to add
Also be informed that, starting from JMeter 3.1, it is recommended to use Groovy as the scripting language, so consider migrating; it will be as simple as removing the first line from your script.bsh.

Google Cloud DLP - CSV inspection

I'm trying to inspect a CSV file and there are no findings being returned (I'm using the EMAIL_ADDRESS info type and the addresses I'm using are coming up with positive hits here: https://cloud.google.com/dlp/demo/#!/). I'm sending the CSV file into inspect_content with a byte_item as follows:
byte_item: {
  type: :CSV,
  data: File.open('/xxxxx/dlptest.csv', 'r').read
}
In looking at the supported file types, it looks like CSV/TSV files are inspected via Structured Parsing.
For CSV/TSV, does that mean one can't just send in the file and instead needs to use the table attribute rather than byte_item, as per https://cloud.google.com/dlp/docs/inspecting-structured-text?
What about XLSX files, for example? They're an unspecified file type, so I tried a configuration like the one below, but it still returned no findings:
byte_item: {
  type: :BYTES_TYPE_UNSPECIFIED,
  data: File.open('/xxxxx/dlptest.xlsx', 'rb').read
}
I'm able to do inspection and redaction with images and text fine, but having a bit of a problem with other file types. Any ideas/suggestions welcome! Thanks!
Edit: The contents of the CSV in question:
$ cat ~/Downloads/dlptest.csv
dylans#gmail.com,anotehu,steve#example.com
blah blah,anoteuh,
aonteuh,
$ file ~/Downloads/dlptest.csv
~/Downloads/dlptest.csv: ASCII text, with CRLF line terminators
The full request:
parent = "projects/xxxxxxxx/global"
inspect_config = {
info_types: [{name: "EMAIL_ADDRESS"}],
min_likelihood: :POSSIBLE,
limits: { max_findings_per_request: 0 },
include_quote: true
}
request = {
parent: parent,
inspect_config: inspect_config,
item: {
byte_item: {
type: :CSV,
data: File.open('/xxxxx/dlptest.csv', 'r').read
}
}
}
dlp = Google::Cloud::Dlp.dlp_service
response = dlp.inspect_content(request)
The CSV file I was testing with was something I created using Google Sheets and exported as a CSV; however, the file showed locally as "text/plain; charset=us-ascii". I downloaded a CSV off the internet and it had a MIME type of "text/csv; charset=utf-8". This is the one that worked. So it looks like my issue was specifically due to the file having an incorrect MIME type.
xlsx is not yet supported. Coming soon. (Maybe that part of the question should be split out from the CSV debugging issue.)
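For reference, if you do want to force Structured Parsing, the inspecting-structured-text guide linked in the question sends a table item instead of a byte_item. A minimal sketch of what that might look like with the same Ruby client (the header names are made up, since the sample CSV has no header row, and the rows just reuse the example values from the question):
# Sketch: inspect the CSV contents as a structured table rather than raw bytes.
# Reuses parent, inspect_config and dlp from the request shown above.
request = {
  parent: parent,
  inspect_config: inspect_config,
  item: {
    table: {
      headers: [{ name: "col_1" }, { name: "col_2" }, { name: "col_3" }],
      rows: [
        { values: [{ string_value: "dylans#gmail.com" },
                   { string_value: "anotehu" },
                   { string_value: "steve#example.com" }] },
        { values: [{ string_value: "blah blah" },
                   { string_value: "anoteuh" },
                   { string_value: "" }] }
      ]
    }
  }
}
response = dlp.inspect_content(request)
response.result.findings.each { |finding| puts finding.info_type.name }
Because the table form hands DLP already-parsed cells, it should also sidestep MIME-detection issues like the one described above.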

WASX7017E: Jython exception File "<string>"

When I try to execute my Jython script I get this error:
WASX7017E: Exception received while running file "/opt/test_wsadmin_configura_datasource.jython"; exception information: com.ibm.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 29, in ?
NameError: dbuser
This is my test_wsadmin_configura_datasource.jython
##
# src: https://www.ibm.com/developerworks/community/blogs/timdp/entry/automating_application_installation_and_configuration_into_websphere_application_server46?lang=en
#
# run with wsadmin.sh -lang jython
# get an environment variable
import os;
#installRoot = os.environ["INSTALL_ROOT"]
# useful variables
cell = AdminControl.getCell()
node = AdminControl.getNode()
server = AdminControl.getConfigId(AdminControl.queryNames("node="+node+",type=Server,*"))
varmap = AdminConfig.list('VariableMap', server)
appman = AdminControl.queryNames("type=ApplicationManager,*")
def createJ2CAuthAlias(alias, description, user, password):
    sec = AdminConfig.getid('/Cell:' + cell + '/Security:/')
    alias_attr = ["alias", alias]
    desc_attr = ["description", description]
    userid_attr = ["userId", user]
    password_attr = ["password", password]
    attrs = [alias_attr, desc_attr, userid_attr, password_attr]
    authdata = AdminConfig.create('JAASAuthData', sec, attrs)
    print "J2C Auth Alias created ---> " + alias
    AdminConfig.save()
    return
createJ2CAuthAlias(dbuser,description,DBUSER,PASS)
WebSphere is running inside the original Docker image ibmcom/websphere-traditional:8.5.5.11-install.
How can I solve this?
EDIT 1: Here I found that the issue can be related to a non-UTF-8 character:
These errors can occur because there are UTF-8 characters in the file that are not valid.
...
An easy way to determine if a character that is not valid is causing the error is to enter export LANG=C and run the script again.
export LANG=C does not change the result.
Just found that double-quoting the arguments does the job:
createJ2CAuthAlias("dbuser","description","DBUSER","PASS")

NativeScript when adding custom resource/json files

I am creating a project targeting Android, and I want it to ship with a .json file from which some data will be loaded.
I put my .json file into the Android folder. When running "tns run android --device 1" (which is my physical device) I get:
-code-gen:
[mergemanifest] No changes in the AndroidManifest files.
[echo] Handling aidl files...
[aidl] No AIDL files to compile.
[echo] ----------
[echo] Handling RenderScript files...
[echo] ----------
[echo] Handling Resources...
[aapt] Found new input file
[aapt] Generating resource IDs...
[aapt] invalid resource directory name: /Users/konrad/Desktop/NativeScript/hello-world/platforms/android/res res
[aapt] invalid resource directory name: /Users/konrad/Desktop/NativeScript/hello-world/platforms/android/res small.json
BUILD FAILED
/usr/local/Cellar/android-sdk/24.3.3/tools/ant/build.xml:649: The following error occurred while executing this line:
/usr/local/Cellar/android-sdk/24.3.3/tools/ant/build.xml:694: null returned: 1
Total time: 0 seconds
Command ant failed with exit code 1
The file is called small.json
EDIT: Even if I remove the file, the problem still remains.
Android restricts access to some parts of the file system. There are two ways of accessing files in NativeScript for Android:
1) JavaScript way
var fs = require('file-system');
var documents = fs.knownFolders.documents();
var myFile = documents.getFile(filename);
as specified in the docs: https://github.com/NativeScript/docs/blob/master/ApiReference/file-system/HOW-TO.md
2) Android-specific way, using a third-party library called "json-simple", which needs to be attached using the command tns library add android .
var context = app.android.context;
var assets = context.getAssets();
var br = new java.io.BufferedReader(new java.io.InputStreamReader(assets.open(filename)));
var parser = new org.json.simple.parser.JSONParser();
try {
    parser.parse(br);
} catch (ex) {
    var javaEx = ex.nativeException;
    var msg = javaEx.getMessage();
    console.log("whops! : " + msg);
}
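Following up on the JavaScript way above: once you have the myFile handle, reading and parsing the bundled JSON could look roughly like this (a sketch only; readText() returns a promise, and the logging is illustrative):
// Sketch: read the JSON file obtained above and parse its contents.
myFile.readText().then(function (content) {
    var data = JSON.parse(content);
    console.log("Loaded keys: " + Object.keys(data).join(", "));
}, function (error) {
    console.log("Could not read " + filename + ": " + error);
});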

Archive::Any gives IO error

#!/usr/bin/perl
use strict;
use warnings;
my $archive_files = "C:\\Temp\\FREMOTE\\test.zip";
sub extract_archive($$);
extract_archive($archive_files, "C:\\Temp\\FREMOTE\\TEST\\");
extract_archive("C:\\Temp\\FREMOTE\\TEST\\testb.zip",
"C:\\Temp\\FREMOTE\\TEST\\testb\\");
sub extract_archive($$) {
    my $archive_file = shift;
    my $extract_dir  = shift;
    if (! -d "$extract_dir") {
        mkdir $extract_dir;
    }
    use Archive::Any;
    my $archive = Archive::Any->new($archive_file);
    if ($archive->extract($extract_dir)) {
        print "Extracted $archive_file into $extract_dir\n";
        undef $archive;
    } else {
        print "Failed to extracted $archive_file into $extract_dir\n";
    }
}
I got the following error. How do I resolve it?
IO error: write error during copy : Bad file descriptor
at C:/Perl/site/lib/Archive/Any.pm line 193.
IO error: write error during copy : Bad file descriptor
at C:/Perl/site/lib/Archive/Any.pm line 193.
IO error: write error during copy : Bad file descriptor
at C:/Perl/site/lib/Archive/Any.pm line 193.
IO error: write error during copy : Bad file descriptor
at C:/Perl/site/lib/Archive/Any.pm line 193.
I tested it with the following code. Using two known-good zip files, I added the second zip file into the first - to reproduce what I believe you are doing. With the original code I kept receiving an error during the extraction of the second file:
Extracted C:\Temp\colorbox-master.zip into C:\Temp\FREMOTE\TEST\
Can't call method "extract" on an undefined value at Perl-1.pl line 19.
Different from your error, but fixed with the following code:
#!/usr/bin/perl
use strict;
use warnings;
my $archive_files = "C:\\Temp\\colorbox-master.zip";
extract_archive($archive_files, "C:\\Temp\\FREMOTE\\TEST\\");
extract_archive("C:\\Temp\\FREMOTE\\TEST\\easybox-v1.3.zip", "C:\\Temp\\FREMOTE\\TEST\\testb\\");
sub extract_archive {
    my $archive_file = shift;
    my $extract_dir  = shift;
    if (!-d "$extract_dir") {
        mkdir $extract_dir;
    }
    use Archive::Any;
    my $archive = Archive::Any->new($archive_file);
    if ($archive->extract($extract_dir)) {
        print "Extracted $archive_file into $extract_dir\n";
        undef $archive;
    } else {
        print "Failed to extracted $archive_file into $extract_dir\n";
    }
}
Extracted C:\Temp\colorbox-master.zip into C:\Temp\FREMOTE\TEST\
Extracted C:\Temp\FREMOTE\TEST\easybox-v1.3.zip into C:\Temp\FREMOTE\TEST\testb\
Note that I had just installed the Archive::Any-0.0932 module (ActiveState Perl), so I may have a different (fixed) version. You may want to check that your module is at the most recent version, and that your zip files are not broken.
