Cannot open JMeter non-GUI mode from shell script

I am trying to run JMeter from a shell script, but it's giving me an error:
Error in NonGUIDriver java.lang.IllegalArgumentException: Problem loading XML from:'/dynamicJmeter.jmx', missing class com.thoughtworks.xstream.converters.ConversionException:
---- Debugging information ----
cause-exception : com.thoughtworks.xstream.converters.ConversionException
cause-message :
first-jmeter-class : org.apache.jmeter.save.converters.HashTreeConverter.unmarshal(HashTreeConverter.java:67)
class : org.apache.jmeter.save.ScriptWrapper
required-type : org.apache.jorphan.collections.ListedHashTree
converter-type : org.apache.jmeter.save.ScriptWrapperConverter
path : /jmeterTestPlan/hashTree/hashTree/hashTree[8]/com.blazemeter.jmeter.RandomCSVDataSetConfig
line number : 1309
version : 4.0 r1823414
-------------------------------
I have installed the Plugin Manager and the UDP support plugin in JMeter, but the command still doesn't work.

You're missing the Random CSV Data Set Config plugin, which is referenced in your .jmx test script but not installed on the JMeter instance where you're trying to open the file.
So make sure to install the plugin using the JMeter Plugins Manager
and restart JMeter afterwards to pick up the plugin.
Just in case, check out the Introducing the Random CSV Data Set Config Plugin on JMeter article for comprehensive plugin installation and usage instructions.

Related

GitLab CI CD runner not loading properties file for profile

When I run the command mvn clean test -Dspring.profiles.active=GITLAB-CI-TEST in GitLab CI/CD, it does not load the properties file application-gitlab-ci-test.properties. It loads only application.properties.
As application-gitlab-ci-test.properties contains a different value for spring.datasource.url, the pipeline fails on the remote runners with the error:
The last packet sent successfully to the server was 0 milliseconds ago.
The driver has not received any packets from the server.
Of course, this error is expected, as application.properties refers to the localhost database.
The code that loads application-gitlab-ci-test.properties:
@Profile("GITLAB-CI-TEST")
@PropertySource("classpath:application-gitlab-ci-test.properties")
@Configuration
public class GitLabCiTestProfile {
}
When I run the same command locally, it works as expected, and in the logs I see the following records:
2020-03-30 19:23:00.609 DEBUG 604 --- [ main]
o.s.b.c.c.ConfigFileApplicationListener : Loaded config file
'file:/G:/****/****/****/****/target/classes/application.properties'
(classpath:/application.properties)
2020-03-30 19:23:00.609 DEBUG 604 --- [ main]
o.s.b.c.c.ConfigFileApplicationListener : Loaded config file
'file:/G:/****/****/****/****/target/classes/application-GITLAB-CI-TEST.properties' (classpath:/application-GITLAB-CI-TEST.properties) for profile
GITLAB-CI-TEST
I noticed that the remote runners are missing the second entry, the one that loads application-GITLAB-CI-TEST.properties.
I also tried mvn clean test --batch-mode -PGITLAB-CI-TEST; this one too fails on the remote host, but the local run works as expected.
I found a workaround for this issue by using the command
mvn clean test --batch-mode -Dspring.datasource.url=jdbc:mysql://mysql-db:3306/*******?useSSL=false&allowPublicKeyRetrieval=true
Can you please help me solve this issue, as this workaround does not satisfy me?
I found the solution to this issue.
I changed the profile name from upper case (GITLAB-CI-TEST) to lower case (gitlab-ci-test), to match the lower-case profile name in the properties file name, application-gitlab-ci-test.properties.
Now in the remote runner, I'm using the following command:
mvn clean test -Dspring.profiles.active=gitlab-ci-test
Spring doc - link
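The behaviour difference between the local run and the remote runner is likely down to filesystem case sensitivity: the local log shows a G:/ path (Windows, where classpath resource lookup is case-insensitive), while GitLab runners are typically Linux (case-sensitive). Spring appends the active profile name verbatim when resolving the profile-specific file, as this minimal sketch illustrates (a hypothetical simplification, not Spring's actual implementation):

```java
// Hypothetical sketch of how the profile-specific properties file name is
// derived: the active profile string is appended verbatim, so on a
// case-sensitive filesystem "GITLAB-CI-TEST" and "gitlab-ci-test" name
// two different resources.
public class ProfilePropertiesName {
    static String propertiesFileFor(String profile) {
        return "application-" + profile + ".properties";
    }

    public static void main(String[] args) {
        // Only matches the lower-case file on case-insensitive filesystems
        // (e.g. Windows, which is why the local run worked):
        System.out.println(propertiesFileFor("GITLAB-CI-TEST"));
        // Exact match everywhere:
        System.out.println(propertiesFileFor("gitlab-ci-test"));
    }
}
```

This is why matching the case of the profile name to the file name fixes the pipeline without touching any datasource properties.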

Elasticsearch error

I'm intending to fix bugs on the Elasticsearch open-source project. I forked it and cloned the fork. Then I imported it as a Maven project in Eclipse and ran a Maven build. So far so good.
I opened the ElasticsearchF.java file and tried to run it as a Java application (as per the directions written in http://www.lindstromhenrik.com/debugging-elasticsearch-in-eclipse/).
But I get an error saying path.home is not set for Elasticsearch, which throws an IllegalStateException.
My questions are:
Why does this error occur in the first place?
As I said, I want to fix bugs in the Elasticsearch project. Is this the right way to set up the environment for my goal? Or should I have a client send requests to the Elasticsearch server and then set debug points in the Elasticsearch source code? How do I achieve this?
Thanks for your patience.
Update:
I added the VM argument as mentioned by one of the answerers.
It now throws different errors, and I'm clueless about why:
java.io.IOException: Resource not found: "org/joda/time/tz/data/ZoneInfoMap" ClassLoader: sun.misc.Launcher$AppClassLoader#29578426
at org.joda.time.tz.ZoneInfoProvider.openResource(ZoneInfoProvider.java:210)
at org.joda.time.tz.ZoneInfoProvider.<init>(ZoneInfoProvider.java:127)
at org.joda.time.tz.ZoneInfoProvider.<init>(ZoneInfoProvider.java:86)
at org.joda.time.DateTimeZone.getDefaultProvider(DateTimeZone.java:514)
at org.joda.time.DateTimeZone.getProvider(DateTimeZone.java:413)
at org.joda.time.DateTimeZone.forID(DateTimeZone.java:216)
at org.joda.time.DateTimeZone.getDefault(DateTimeZone.java:151)
at org.joda.time.chrono.ISOChronology.getInstance(ISOChronology.java:79)
at org.joda.time.DateTimeUtils.getChronology(DateTimeUtils.java:266)
at org.joda.time.format.DateTimeFormatter.selectChronology(DateTimeFormatter.java:968)
at org.joda.time.format.DateTimeFormatter.printTo(DateTimeFormatter.java:672)
at org.joda.time.format.DateTimeFormatter.printTo(DateTimeFormatter.java:560)
at org.joda.time.format.DateTimeFormatter.print(DateTimeFormatter.java:644)
at org.elasticsearch.Build.<clinit>(Build.java:53)
at org.elasticsearch.node.Node.<init>(Node.java:138)
at org.elasticsearch.node.NodeBuilder.build(NodeBuilder.java:157)
at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:177)
at org.elasticsearch.bootstrap.Bootstrap.main(Bootstrap.java:278)
at org.elasticsearch.bootstrap.ElasticsearchF.main(ElasticsearchF.java:30)
[2015-06-16 18:51:36,892][INFO ][node ] [Kismet Deadly] version[2.0.0-SNAPSHOT], pid[2516], build[9b833fd/2015-06-15T03:38:40Z]
[2015-06-16 18:51:36,892][INFO ][node ] [Kismet Deadly] initializing ...
[2015-06-16 18:51:36,899][INFO ][plugins ] [Kismet Deadly] loaded [], sites []
{2.0.0-SNAPSHOT}: Initialization Failed ...
- ExceptionInInitializerError
IllegalArgumentException[An SPI class of type org.apache.lucene.codecs.PostingsFormat with name 'Lucene50' does not exist. You need to add the corresponding JAR file supporting this SPI to your classpath. The current classpath supports the following names: [es090, completion090, XBloomFilter]]
I got help from the developer community in https://github.com/elastic/elasticsearch/issues/12737 and was able to debug it.
In short, the procedure is:
1) Search for the file Elasticsearch.java/ElasticsearchF.java inside the package org.elasticsearch.bootstrap.
2) Right-click -> Run Configurations...
3) In the window that pops up, click the "Arguments" tab; under "Program arguments:" enter start,
and under "VM arguments:" enter
-Des.path.home={path to your elasticsearch code base root folder}/core -Des.security.manager.enabled=false
4) Click "Apply", then click "Run".
It runs now.
To check, go to localhost:9200 and you will get a message something like:
{
  "name" : "Raza",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "2.0.0-beta1",
    "build_hash" : "${buildNumber}",
    "build_timestamp" : "NA",
    "build_snapshot" : true,
    "lucene_version" : "5.2.1"
  },
  "tagline" : "You Know, for Search"
}
For more info on the arguments, see: https://github.com/elastic/elasticsearch/commit/2b9ef26006c0e4608110164480b8127dffb9d6ad
Edit your debug/run configuration and put this in the VM arguments:
-Des.path.home=C:\github\elasticsearch\
Change C:\github\elasticsearch\ to your elasticsearch root path.
The reason is that some of the arguments set in elasticsearch.bat are missing when you debug/run it from Eclipse.

uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler in JMeter script

JMeter crashes while trying to load a .jmx script, giving the error "Empty TestPlan - see log file". I did my research and found that the possible causes are:
1) Moving up/down Java versions on your machine.
2) Some JAR missing from the lib/ext folder of JMeter.
The issue seems to be the latter, as I can see the following line in the .jmx script:
<uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler guiclass="uk.co.logtailer.jmeter.protocol.mq.control.gui.MQSamplerGui" testclass="uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler" testname="MQSampler" enabled="false">
I am not able to find the JAR which can support this MQ Sampler. I have tried a few from ActiveMQ, but they didn't work.
I would appreciate it if someone could help me with the JAR, or point out whether my understanding of the issue is wrong.
Log shows:
2014/09/04 10:36:12 ERROR - jmeter.save.SaveService: Conversion error com.thoughtworks.xstream.converters.ConversionException: uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler : uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler
---- Debugging information ----
message : uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler
cause-exception : com.thoughtworks.xstream.mapper.CannotResolveClassException
cause-message : uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler
class : org.apache.jorphan.collections.ListedHashTree
required-type : org.apache.jorphan.collections.ListedHashTree
converter-type : org.apache.jmeter.save.converters.HashTreeConverter
path : /jmeterTestPlan/hashTree/hashTree/hashTree[4]/uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler
line number : 65
------------------------------- : uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler : uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler
The problem is that you don't have the JAR containing the uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler and uk.co.logtailer.jmeter.protocol.mq.control.gui.MQSamplerGui classes in your JMeter classpath. You need to find it somewhere and drop it into the lib/ext folder of your JMeter installation.
However, looking at the enabled="false" attribute: since all these custom samplers are disabled, you can safely (and carefully) remove them from the .jmx file with any text editor, preferably one with XML syntax highlighting and XML validation capabilities.
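If there are many occurrences, the removal can also be scripted. A rough sketch using the JDK's DOM API, assuming the standard .jmx layout where each test element is immediately followed by its paired <hashTree> sibling (back up the file first; the class and file names here are illustrative):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.*;
import java.io.File;

public class JmxCleaner {
    // Remove every element with the given tag name together with the
    // <hashTree> sibling that immediately follows it (standard .jmx layout).
    public static void removeSampler(Document doc, String tagName) {
        NodeList nodes = doc.getElementsByTagName(tagName);
        // Iterate backwards: the NodeList is live and shrinks as we remove.
        for (int i = nodes.getLength() - 1; i >= 0; i--) {
            Node sampler = nodes.item(i);
            Node sibling = sampler.getNextSibling();
            // Skip whitespace text nodes to find the paired hashTree.
            while (sibling != null && sibling.getNodeType() != Node.ELEMENT_NODE) {
                sibling = sibling.getNextSibling();
            }
            Node parent = sampler.getParentNode();
            if (sibling != null && "hashTree".equals(sibling.getNodeName())) {
                parent.removeChild(sibling);
            }
            parent.removeChild(sampler);
        }
    }

    public static void main(String[] args) throws Exception {
        if (args.length < 2) {
            System.err.println("usage: JmxCleaner <in.jmx> <out.jmx>");
            return;
        }
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(new File(args[0]));
        removeSampler(doc, "uk.co.logtailer.jmeter.protocol.mq.sampler.MQSampler");
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(new File(args[1])));
    }
}
```

Loading the cleaned file into JMeter should then succeed, provided the disabled samplers were the only unresolvable classes in the plan.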
A couple of guides on JMS/MQ testing with JMeter:
Building a JMS Topic Test Plan
Building a JMS Testing Plan - Apache JMeter

DBPedia Live mirror setup on Mac OS X

I am trying to set up a DBpedia Live Mirror on my personal Mac machine. Here is some technical host information about my setup:
Operating System: OS X 10.9.3
Processor 2.6 GHz Intel Core i7
Memory 16 GB 1600 MHz DDR3
Database server used for hosting data for the DBpedia Live Mirror: OpenLink Virtuoso (Open-source edition)
Here's a summary of the steps I followed so far:
Downloaded the initial data seed from DBpedia Live as dbpedia_2013_07_18.nt.bz2.
Downloaded the synchronization tool from http://sourceforge.net/projects/dbpintegrator/files/.
Executed the virtload.sh script. Had to tweak some commands in here to be compatible with OS X.
Adapted the synchronization tool's configuration files according to the README.txt file as follows:
a) Set the start date in file "lastDownloadDate.dat" to the date of that dump (2013-07-18-00-000000).
b) Set the configuration information in file "dbpedia_updates_downloader.ini", such as login credentials for Virtuoso, and GraphURI.
Executed "java -jar dbpintegrator-1.1.jar" on the command line.
This script repeatedly showed the following error:
INFO - Options file read successfully
INFO - File : http://live.dbpedia.org/changesets/lastPublishedFile.txt has been successfully downloaded
INFO - File : http://live.dbpedia.org/changesets/2014/06/16/13/000001.removed.nt.gz has been successfully downloaded
WARN - File /Users/shruti/virtuoso/dbpedia-live/UpdatesDownloadFolder/000001.removed.nt.gz cannot be decompressed due to Unexpected end of ZLIB input stream
ERROR - Error: (No such file or directory)
INFO - File : http://live.dbpedia.org/changesets/2014/06/16/13/000001.added.nt.gz has been successfully downloaded
WARN - File /Users/shruti/virtuoso/dbpedia-live/UpdatesDownloadFolder/000001.added.nt.gz cannot be decompressed due to Unexpected end of ZLIB input stream
ERROR - Error: (No such file or directory)
INFO - File : http://live.dbpedia.org/changesets/lastPublishedFile.txt has been successfully downloaded
INFO - File : http://live.dbpedia.org/changesets/2014/06/16/13/000002.removed.nt.gz has been successfully downloaded
INFO - File : /Users/shruti/virtuoso/dbpedia-live/UpdatesDownloadFolder/000002.removed.nt.gz decompressed successfully to /Users/shruti/virtuoso/dbpedia-live/UpdatesDownloadFolder/000002.removed.nt
WARN - null Function executeStatement
WARN - null Function executeStatement
WARN - null Function executeStatement
WARN - null Function executeStatement
WARN - null Function executeStatement
...
Questions
Why do I repeatedly see the following error when running the Java program dbpintegrator-1.1.jar? Does this mean that the triples from these files were not updated in my live mirror?
WARN - File /Users/shruti/virtuoso/dbpedia-live/UpdatesDownloadFolder/000001.removed.nt.gz cannot be decompressed due to Unexpected end of ZLIB input stream
ERROR - Error: (No such file or directory)
How can I verify that the data loaded in my mirror is up to date? Is there a SPARQL query I can use to validate this?
I see that the data in my live mirror is missing wikiPageId (http://dbpedia.org/ontology/wikiPageID) and wikiPageRevisionID. Why is that? Is this data missing from the DBpedia live data dumps?
It should be fixed now.
Can you try again from here: https://github.com/dbpedia/dbpedia-live-mirror

How can I fix the Oracle Enterprise Manager error - Agent is blocked?

I have installed OEM 12c, and after restarting my OMS the agents got the following error:
Heartbeat Status : Agent is blocked
Blocked Reason : Plugin mismatch found between agent and repository.
I have restarted the agent, but still no success.
How do I fix this?
I suggest you re-sync your agent using the EM GUI!
Here is an article showing how this is done!
In short:
Check your agent status:
./emctl status agent
Oracle Enterprise Manager Cloud Control 12c Release 3
Copyright (c) 1996, 2013 Oracle Corporation. All rights reserved.
---------------------------------------------------------------
Agent Version : 12.1.0.3.0
OMS Version : (unknown)
Protocol Version : 12.1.0.1.0
Agent Home : /em_agent12.3/agent_inst
Agent Binaries : /em_agent12.3/core/12.1.0.3.0
Agent Process ID : 64164
Parent Process ID : 64110
Agent URL : https://bih002:3872/emd/main/
Repository URL : https://dcg023:4900/empbs/upload
Started at : 2013-11-13 15:26:08
Started by user : em_user
Last Reload : (none)
Last successful upload : (none)
Last attempted upload : 2013-11-13 15:26:14
Total Megabytes of XML files uploaded so far : 0
Number of XML files pending upload : 72
Size of XML files pending upload(MB) : 0.11
Available disk space on upload filesystem : 91.16%
Collection Status : Collections enabled
Heartbeat Status : **Agent is blocked**
Next, re-sync your agent using the GUI.
Then check your agent status again!
