I want to set up a Gatling test case to put messages on an Oracle AQ queue, but I have no idea how to fill in the following:
val jmsConfig = jms
  .connectionFactoryName(???)
  .url("tcp://localhost:10002")
  .credentials("user", "secret")
  .contextFactory(???)
  .listenerCount(1)
  .usePersistentDeliveryMode
What is the connection factory name and what is the context factory?
I managed to get it working using oracle.jms.AQjmsInitialContextFactory. The oracle.jms.AQjmsFactory mentioned in the other answer is not an InitialContextFactory, so that won't work.
Make sure to add version 11 or later of the Oracle AQ dependencies so that the AQjmsInitialContextFactory can be found.
Your database user should, of course, have the correct privileges to be able to insert messages into the queue (table).
Gatling expects you to have request-reply semantics, so it will wait for a reply to be received. I actually wanted to abort waiting for a reply after a specified period, but I have no clue how to do that. So if anyone knows how, please tell me :-)
MySimulation.scala
val jmsConfig = jms
  .connectionFactoryName("ConnectionFactory") // MUST be called ConnectionFactory; the AQjmsFactory expects this naming convention!
  .url("jdbc:oracle:thin:@host:1521:SID")
  .credentials("user", "password")
  .contextFactory("oracle.jms.AQjmsInitialContextFactory")
  .listenerCount(1)
  .usePersistentDeliveryMode
  // TODO check how to set a timeout on the reply
val jmsScenario = scenario("JMS DSL test")
  .repeat(1) {
    exec(
      jms("req reply testing")
        .reqreply
        .queue("AQ_ADMIN.QUEUE_NAME")
        .textMessage("some message")
        .check(simpleCheck(checkBodyTextCorrect))
    )
  }
def checkBodyTextCorrect(m: Message) = {
  // this assumes that the service just does an "uppercase" transform on the text
  m match {
    case tm: TextMessage => tm.getText == "text that should be in the reply message"
    case _ => false
  }
}
setUp(jmsScenario.inject(atOnceUsers(1)).protocols(jmsConfig));
jndi.properties
I had to add a jndi.properties file to the classpath:
db_url=jdbc:oracle:thin:@host:1521:SID
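For reference, those Gatling settings and the db_url property boil down to a plain JNDI lookup roughly like the following sketch (class name and wiring are illustrative; the real AQjmsInitialContextFactory may read its settings slightly differently):

import java.util.Properties;

import javax.jms.ConnectionFactory;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;

// Illustrative sketch of the JNDI lookup behind the Gatling settings above.
public final class AqJndiLookupSketch {

    public static ConnectionFactory lookupConnectionFactory() throws NamingException {
        Properties env = new Properties();
        // contextFactory -> the InitialContextFactory implementation class
        env.put(Context.INITIAL_CONTEXT_FACTORY, "oracle.jms.AQjmsInitialContextFactory");
        // the JDBC URL the AQ context factory connects with (same entry as in jndi.properties)
        env.put("db_url", "jdbc:oracle:thin:@host:1521:SID");
        // credentials -> the database user/password
        env.put(Context.SECURITY_PRINCIPAL, "user");
        env.put(Context.SECURITY_CREDENTIALS, "password");

        Context ctx = new InitialContext(env);
        // connectionFactoryName -> the JNDI name that gets looked up ("ConnectionFactory")
        return (ConnectionFactory) ctx.lookup("ConnectionFactory");
    }
}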
pom.xml
Dependencies (Maven):
<dependency>
  <groupId>oracle</groupId>
  <artifactId>aqapi</artifactId>
  <version>11.2.0.3</version>
</dependency>
<dependency>
  <groupId>com.oracle</groupId>
  <artifactId>ojdbc6</artifactId>
  <version>11.2.0.2.0</version>
</dependency>
<dependency>
  <groupId>javax.transaction</groupId>
  <artifactId>jta</artifactId>
  <version>1.1</version>
</dependency>
"contextFactory" is the class name of your ContextFactory. Doc seems to state it's "oracle.jms.AQjmsFactory".
"connectionFactoryName" is the key used for the JNDI look-up. Doc again seems to state it's "cn=OracleDBConnections".
Related
I have been trying to understand how the Surefire plugin internally decides which testing framework to use (TestNG, Jupiter, JUnit4, etc.).
Does it use reflection and try to find the presence of each framework on the classpath?
(Looking at the dependencies, Surefire seems to come with JUnit4 in its transitive dependencies - junit:junit:jar:4.12.)
It's possible to pass the provider (test-framework type) explicitly by setting an additional plugin dependency, e.g. for TestNG:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>3.0.0-M5</version>
  <dependencies>
    <dependency>
      <groupId>org.apache.maven.surefire</groupId>
      <artifactId>surefire-testng</artifactId>
      <version>3.0.0-M5</version>
    </dependency>
  </dependencies>
</plugin>
If nothing is specified
Surefire normally automatically selects which test-framework provider to use based on the version of TestNG/JUnit present in your project's classpath.
From this doc:
https://maven.apache.org/surefire/maven-surefire-plugin/examples/providers.html
How the Surefire plugin internally decides which testing framework to use
Let's look at how it's implemented.
There is a ProviderInfo interface with the method boolean isApplicable();
ProviderInfo.java
I've found multiple implementations in the class AbstractSurefireMojo.java
AbstractSurefireMojo.java
for:
TestNgProviderInfo
JUnit3ProviderInfo
JUnit4ProviderInfo
JUnitPlatformProviderInfo
JUnitCoreProviderInfo
DynamicProviderInfo
And there is also a protected method, protected List<ProviderInfo> createProviders( TestClassPath testClasspath ), which references all of these implementations.
protected List<ProviderInfo> createProviders( TestClassPath testClasspath )
    throws MojoExecutionException
{
    Artifact junitDepArtifact = getJunitDepArtifact();
    return providerDetector.resolve( new DynamicProviderInfo( null ),
                                     new JUnitPlatformProviderInfo( getJUnit5Artifact(), testClasspath ),
                                     new TestNgProviderInfo( getTestNgArtifact() ),
                                     new JUnitCoreProviderInfo( getJunitArtifact(), junitDepArtifact ),
                                     new JUnit4ProviderInfo( getJunitArtifact(), junitDepArtifact ),
                                     new JUnit3ProviderInfo() );
}
and the ProviderDetector class invokes isApplicable() on each ProviderInfo in its resolve method.
ProviderDetector.java
And it looks like the first applicable one is selected:
private Optional<ProviderInfo> autoDetectOneWellKnownProvider( ProviderInfo... wellKnownProviders )
{
    Optional<ProviderInfo> providerInfo = stream( wellKnownProviders )
            .filter( ProviderInfo::isApplicable )
            .findFirst();
    providerInfo.ifPresent( p -> logger.info( "Using auto detected provider " + p.getProviderName() ) );
    return providerInfo;
}
I have a Maven application in which I use BIRT 4.6. Below are my dependencies.
<dependency>
  <groupId>org.eclipse.birt.ojdbc</groupId>
  <artifactId>odajdbc</artifactId>
  <version>4.6.0-201606072122</version>
</dependency>
<dependency>
  <groupId>org.eclipse.birt.runtime</groupId>
  <artifactId>org.eclipse.birt.runtime</artifactId>
  <version>4.6.0-20160607</version>
  <exclusions>
    <exclusion>
      <groupId>org.eclipse.birt.runtime</groupId>
      <artifactId>org.apache.xerces</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.eclipse.birt.runtime</groupId>
      <artifactId>org.apache.poi</artifactId>
    </exclusion>
  </exclusions>
</dependency>
I am able to connect to the database and generate the reports. That's the good news.
Unfortunately, I noticed in my log file that an exception is thrown. The exception can be seen below:
2017-01-10 14:57:15,446 SEVERE [org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager] (default task-6) DriverClassLoader failed to load class: oracle.jdbc.driver.OracleDriver: java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver
at org.eclipse.birt.core.framework.URLClassLoader.findClass1(URLClassLoader.java:188)
at org.eclipse.birt.core.framework.URLClassLoader$1.run(URLClassLoader.java:156)
at org.eclipse.birt.core.framework.URLClassLoader$1.run(URLClassLoader.java:1)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.birt.core.framework.URLClassLoader.findClass(URLClassLoader.java:151)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.loadExtraDriver(JDBCDriverManager.java:1064)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.findDriver(JDBCDriverManager.java:859)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.loadAndRegisterDriver(JDBCDriverManager.java:986)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.loadAndRegisterDriver(JDBCDriverManager.java:958)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.doConnect(JDBCDriverManager.java:285)
at org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.getConnection(JDBCDriverManager.java:236)
at org.eclipse.birt.report.data.oda.jdbc.Connection.connectByUrl(Connection.java:254)
at org.eclipse.birt.report.data.oda.jdbc.Connection.open(Connection.java:163)
at org.eclipse.datatools.connectivity.oda.consumer.helper.OdaConnection.open(OdaConnection.java:250)
at org.eclipse.birt.data.engine.odaconsumer.ConnectionManager.openConnection(ConnectionManager.java:165)
at org.eclipse.birt.data.engine.executor.DataSource.newConnection(DataSource.java:224)
at org.eclipse.birt.data.engine.executor.DataSource.open(DataSource.java:212)
at org.eclipse.birt.data.engine.impl.DataSourceRuntime.openOdiDataSource(DataSourceRuntime.java:217)
at org.eclipse.birt.data.engine.impl.QueryExecutor.openDataSource(QueryExecutor.java:437)
at org.eclipse.birt.data.engine.impl.QueryExecutor.prepareExecution(QueryExecutor.java:325)
at org.eclipse.birt.data.engine.impl.PreparedQuery.doPrepare(PreparedQuery.java:463)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.produceQueryResults(PreparedDataSourceQuery.java:190)
at org.eclipse.birt.data.engine.impl.PreparedDataSourceQuery.execute(PreparedDataSourceQuery.java:178)
at org.eclipse.birt.data.engine.impl.PreparedOdaDSQuery.execute(PreparedOdaDSQuery.java:179)
at org.eclipse.birt.report.data.adapter.impl.DataRequestSessionImpl.execute(DataRequestSessionImpl.java:651)
at org.eclipse.birt.report.engine.data.dte.DteDataEngine.doExecuteQuery(DteDataEngine.java:152)
at org.eclipse.birt.report.engine.data.dte.AbstractDataEngine.execute(AbstractDataEngine.java:286)
at org.eclipse.birt.report.engine.executor.ExecutionContext.executeQuery(ExecutionContext.java:1947)
at org.eclipse.birt.report.engine.executor.QueryItemExecutor.executeQuery(QueryItemExecutor.java:80)
at org.eclipse.birt.report.engine.executor.TableItemExecutor.execute(TableItemExecutor.java:62)
at org.eclipse.birt.report.engine.internal.executor.dup.SuppressDuplicateItemExecutor.execute(SuppressDuplicateItemExecutor.java:43)
at org.eclipse.birt.report.engine.internal.executor.wrap.WrappedReportItemExecutor.execute(WrappedReportItemExecutor.java:46)
at org.eclipse.birt.report.engine.internal.executor.l18n.LocalizedReportItemExecutor.execute(LocalizedReportItemExecutor.java:34)
at org.eclipse.birt.report.engine.layout.html.HTMLBlockStackingLM.layoutNodes(HTMLBlockStackingLM.java:65)
at org.eclipse.birt.report.engine.layout.html.HTMLPageLM.layout(HTMLPageLM.java:92)
at org.eclipse.birt.report.engine.layout.html.HTMLReportLayoutEngine.layout(HTMLReportLayoutEngine.java:100)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.doRun(RunAndRenderTask.java:181)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.run(RunAndRenderTask.java:77)
For some reason the JDBCDriverManager struggles to find the correct driver, throws the exception, finally finds the driver, connects to the database and generates the report.
I debugged JDBCDriverManager and hope the information below helps a bit.
1. The app goes through the doConnect() function of JDBCDriverManager. Inside it, Connection getJndiDSConnection( driverClass, jndiNameUrl, connectionProperties ) returns null; the other getJndiDSConnection call in doConnect also returns null.
2. Then loadAndRegisterDriver( driverClass, driverClassPath ) is called with the arguments oracle.jdbc.driver.OracleDriver and null.
3. Inside loadAndRegisterDriver, findDriver( className, driverClassPath, refreshClassLoader ) is called with the arguments oracle.jdbc.driver.OracleDriver, null, false.
4. In the next step, driverClass = loadExtraDriver( className, true, refresh, driverClassPath ) is called with oracle.jdbc.driver.OracleDriver, true, false, null, which throws the ClassNotFoundException mentioned above.
5. Final step: still inside the findDriver method, driver = this.getDriverInstance( driverClass, refresh ) is called and finally returns oracle.jdbc.driver.OracleDriver.
After step 5 everything works fine. As I mentioned, the exception appears only once, and still the connection to the database is created and the reports are generated. After that, no matter how many times I create a report, the exception is never thrown again.
I would like to add some further info about the findDriver method here. This method tries several ways to get the driver. First:
// Driver not in plugin class path; find it in drivers directory
driverClass = loadExtraDriver( className, true, refresh, driverClassPath );
This returns null, and then it tries to get the driver from the context class loader:
driverClass = Class.forName( className, true, Thread.currentThread( ).getContextClassLoader());
This time it finally manages to retrieve the driver.
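To summarise the flow I observed, it boils down to something like this (a simplified illustration, not BIRT's actual source):

// Simplified illustration of the driver lookup order described above (not BIRT's actual code).
// "driversDirLoader" stands in for the URLClassLoader that BIRT builds over the plugin's drivers directory.
public final class DriverLookupSketch {

    static Class<?> findDriverClass(String className, ClassLoader driversDirLoader)
            throws ClassNotFoundException {
        try {
            // 1) loadExtraDriver: look in the plugin "drivers" directory first ...
            return Class.forName(className, true, driversDirLoader);
        } catch (ClassNotFoundException notInDriversDir) {
            // ... this is the ClassNotFoundException that ends up in the log ...
        }
        // 2) ... then fall back to the web application's context class loader,
        //    which succeeds because ojdbc is on the application classpath.
        return Class.forName(className, true, Thread.currentThread().getContextClassLoader());
    }
}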
What am I missing? It is clear that it cannot load the driver from the plugins, since I do not have any plugins directory. Is there a way to overcome this exception?
As Mark mentioned, there was no need to add org.eclipse.birt.ojdbc as a dependency. I stopped using org.eclipse.birt.report.data.oda.jdbc_4.6.0.v201606072122.jar and used my local ojdbc driver instead.
<dependency>
  <groupId>com.oracle</groupId>
  <artifactId>ojdbc6</artifactId>
  <version>11.2.0.4.0</version>
</dependency>
The above fixes the exception we get on the first try to load the driver.
Adding ojdbc7.jar under the WEB-INF path of the BIRT Viewer folder (web/app server side) solved the problem for me:
[1] ../lib
[2] ../platform/plugins/org.eclipse.birt.report.data.oda.jdbc_<VERSION>/drivers
Logs
Before adding [2] above (when I had only [1]):
20-Mar-2017 14:12:26.752 SEVERE [http-nio-8080-exec-4] org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager.loadExtraDriver DriverClassLoader failed to load class: oracle.jdbc.driver.OracleDriver java.lang.ClassNotFoundException: oracle.jdbc.driver.OracleDriver
After adding [2] above (in addition to [1]):
20-Mar-2017 14:49:42.196 INFO [http-nio-8080-exec-4] org.eclipse.birt.report.data.oda.jdbc.JDBCDriverManager$DriverClassLoader.refreshFileURL JDBCDriverManager: found JAR file ojdbc7.jar. URL=file:../WEB-INF/platform/plugins/org.eclipse.birt.report.data.oda.jdbc_4.6.0.v201606072122/drivers/ojdbc7.jar
I have created a single queue with daily rolling. On the next day, I can't read the latest appended message. I found that the tailer index doesn't move to the latest cycle automatically after reading all messages in the previous cycle. By the way, the Java process was shut down at night and restarted the next day.
I use Chronicle Queue V4.52.
Thanks.
This should work; we have tests which show messages are read from one cycle to the next.
Would you be able to include a test which reproduces this? There are quite a few unit tests you can use as examples.
This should now be fixed in the latest version:
<dependency>
  <groupId>net.openhft</groupId>
  <artifactId>chronicle-bom</artifactId>
  <version>1.13.15</version>
  <type>pom</type>
  <scope>import</scope>
</dependency>
or, if you prefer:
<dependency>
  <groupId>net.openhft</groupId>
  <artifactId>chronicle-queue</artifactId>
  <version>4.5.7</version>
</dependency>
Also see the test case net.openhft.chronicle.queue.impl.single.SingleChronicleQueueTest#testReadingWritingWhenCycleIsSkipped:
@Test
public void testReadingWritingWhenCycleIsSkipped() throws Exception {
    final Path dir = Files.createTempDirectory("demo");
    final RollCycles rollCycle = RollCycles.TEST_SECONDLY;

    // write first message
    try (ChronicleQueue queue = ChronicleQueueBuilder
            .single(dir.toString())
            .rollCycle(rollCycle).build()) {
        queue.acquireAppender().writeText("first message");
    }

    Thread.sleep(2100);

    // write second message
    try (ChronicleQueue queue = ChronicleQueueBuilder
            .single(dir.toString())
            .rollCycle(rollCycle).build()) {
        queue.acquireAppender().writeText("second message");
    }

    // read both messages
    try (ChronicleQueue queue = ChronicleQueueBuilder
            .single(dir.toString())
            .rollCycle(rollCycle).build()) {
        ExcerptTailer tailer = queue.createTailer();
        Assert.assertEquals("first message", tailer.readText());
        Assert.assertEquals("second message", tailer.readText());
    }
}
While using the JCR APIs to add a node and a property, I am getting the following error:
7520 [main] ERROR org.apache.jackrabbit.jcr2spi.hierarchy.ChildNodeEntriesImpl - ChildInfo iterator contains multiple entries with the same name|index or uniqueID -> ignore ChildNodeInfo.
I have added the following dependencies in pom.xml:
<dependency>
  <groupId>org.apache.jackrabbit</groupId>
  <artifactId>jackrabbit-jcr-commons</artifactId>
  <version>2.12.1</version>
</dependency>
<dependency>
  <groupId>org.apache.jackrabbit</groupId>
  <artifactId>jackrabbit-jcr2dav</artifactId>
  <version>2.0-beta6</version>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>1.5.8</version>
</dependency>
Java code:
package com.adobe.cq.impl;

import javax.jcr.Node;
import javax.jcr.Repository;
import javax.jcr.Session;
import javax.jcr.SimpleCredentials;

import org.apache.jackrabbit.commons.JcrUtils;

public class GetRepository {

    public static void main(String[] args) {
        try {
            Repository repository = JcrUtils.getRepository("http://localhost:4502/crx/server");
            Session session = repository.login(new SimpleCredentials("admin", "admin".toCharArray()));

            Node root = session.getRootNode();
            Node adobe = root.addNode("adobe");
            Node day = adobe.addNode("cq");
            day.setProperty("message", "Adobe Experience Manager is part of the Adobe Digital Marketing Suite!");

            // Retrieve content
            Node node = root.getNode("adobe/cq");
            System.out.println(node.getPath());
            System.out.println(node.getProperty("message").getString());

            // Save the session changes and log out
            session.save();
            session.logout();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Same-name siblings are not allowed in the repository. Going by your code, there is no check whether the node "adobe" is already present below the root node. Hence, if the node was already created and the above code executes a second time, you may face this issue.
Try checking for node availability as shown below.
Node adobe;
if (!root.hasNode("adobe")) {
    adobe = root.addNode("adobe");
} else {
    adobe = root.getNode("adobe");
}
if (!adobe.hasNode("cq")) {
    Node day = adobe.addNode("cq");
}
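If you prefer, the same check can be wrapped in a small get-or-create helper; a minimal sketch (the NodeUtil class and method name are just illustrative):

import javax.jcr.Node;
import javax.jcr.RepositoryException;

// Illustrative helper: returns the existing child node, or creates it if absent.
public final class NodeUtil {

    public static Node getOrAddNode(Node parent, String name) throws RepositoryException {
        return parent.hasNode(name) ? parent.getNode(name) : parent.addNode(name);
    }
}

With that, the original snippet becomes Node cq = NodeUtil.getOrAddNode(NodeUtil.getOrAddNode(root, "adobe"), "cq"); and can safely run more than once.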
I am using MRUnit to test MapReduce code. I can't use .withInputValue, as it is deprecated.
I could not locate an equivalent that works; setInputValue does not work either. What is the workaround?
Use withInput(). Example (this is with mrunit-1.0.0-hadoop2.jar):
MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;
...
mapDriver.withInput(new LongWritable(), new Text("some line of text"));
mapDriver.withOutput(new Text("some key"), new IntWritable(1));
mapDriver.runTest();
Here's the Maven dependency. Note the hadoop2 classifier.
<dependency>
  <groupId>org.apache.mrunit</groupId>
  <artifactId>mrunit</artifactId>
  <version>1.0.0</version>
  <classifier>hadoop2</classifier>
</dependency>
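For completeness, here is a minimal, self-contained sketch of a test written against this API. The LineLengthMapper below is a made-up mapper used only to have something to drive; the MapDriver calls are what matter:

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.junit.Before;
import org.junit.Test;

public class LineLengthMapperTest {

    // Hypothetical mapper used only to demonstrate the MRUnit API:
    // it emits (line, length-of-line) for every input line.
    public static class LineLengthMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            context.write(value, new IntWritable(value.getLength()));
        }
    }

    private MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;

    @Before
    public void setUp() {
        mapDriver = MapDriver.newMapDriver(new LineLengthMapper());
    }

    @Test
    public void emitsLineAndItsLength() throws IOException {
        mapDriver
            .withInput(new LongWritable(0), new Text("some line of text"))
            .withOutput(new Text("some line of text"), new IntWritable(17))
            .runTest();
    }
}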
For more info, see the tutorial.