Apache Storm intellij local mode - NimbusLeaderNotFoundException - maven

I have set up a project trying to run the standard "ExclamationTopology" on the in-memory version of Storm, triggered from the IntelliJ IDE. Here is my POM.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>storm</groupId>
    <artifactId>sample</artifactId>
    <version>1.0-SNAPSHOT</version>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <hbase.version>0.98.4-hadoop2</hbase.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-client</artifactId>
            <version>2.0.0-SNAPSHOT</version>
        </dependency>
        <dependency>
            <groupId>org.apache.storm</groupId>
            <artifactId>storm-server</artifactId>
            <version>2.0.0-SNAPSHOT</version>
        </dependency>
    </dependencies>
    <repositories>
        <repository>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
            <id>central</id>
            <url>http://repo1.maven.org/maven2/</url>
        </repository>
        <repository>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>true</enabled>
            </snapshots>
            <id>clojars</id>
            <url>https://clojars.org/repo/</url>
        </repository>
    </repositories>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.7.0</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
                <version>1.2.1</version>
                <configuration>
                    <mainClass>test.ExclamationTopology</mainClass>
                    <arguments>-local</arguments>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
Along with the sample source code of my topology:
public class ExclamationTopology extends ConfigurableTopology {
    public static class ExclamationBolt extends BaseRichBolt {
        OutputCollector _collector;

        @Override
        public void prepare(Map<String, Object> conf, TopologyContext context, OutputCollector collector) {
            _collector = collector;
        }

        @Override
        public void execute(Tuple tuple) {
            _collector.emit(tuple, new Values(tuple.getString(0) + "!!!"));
            _collector.ack(tuple);
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("word"));
        }
    }

    public static void main(String[] args) throws Exception {
        ConfigurableTopology.start(new ExclamationTopology(), args);
    }

    protected int run(String[] args) {
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("word", new TestWordSpout(), 10);
        builder.setBolt("exclaim1", new ExclamationBolt(), 3).shuffleGrouping("word");
        builder.setBolt("exclaim2", new ExclamationBolt(), 2).shuffleGrouping("exclaim1");
        conf.setDebug(true);
        String topologyName = "test";
        conf.setNumWorkers(3);
        if (args != null && args.length > 0) {
            topologyName = args[0];
        }
        return submit(topologyName, conf, builder);
    }
}
To be able to run the topology locally from within my IDE via Maven, I included the exec-maven-plugin. I then use the following mvn command to run the application:
exec:java -Dexec.args=-local
However, I do get the following exception:
java.lang.RuntimeException: java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused: connect
at org.apache.storm.security.auth.ThriftClient.reconnect(ThriftClient.java:110) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.security.auth.ThriftClient.<init>(ThriftClient.java:70) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.utils.NimbusClient.<init>(NimbusClient.java:158) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.utils.NimbusClient.getConfiguredClientAs(NimbusClient.java:113) [storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.utils.NimbusClient.getConfiguredClient(NimbusClient.java:83) [storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.blobstore.NimbusBlobStore.prepare(NimbusBlobStore.java:268) [storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.StormSubmitter.getListOfKeysFromBlobStore(StormSubmitter.java:599) [storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.StormSubmitter.validateConfs(StormSubmitter.java:565) [storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.StormSubmitter.submitTopologyAs(StormSubmitter.java:211) [storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:391) [storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.StormSubmitter.submitTopology(StormSubmitter.java:163) [storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.topology.ConfigurableTopology.submit(ConfigurableTopology.java:94) [storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at test.ExclamationTopology.run(ExclamationTopology.java:69) [classes/:?]
at org.apache.storm.topology.ConfigurableTopology.start(ConfigurableTopology.java:70) [storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at test.ExclamationTopology.main(ExclamationTopology.java:47) [classes/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:297) [exec-maven-plugin-1.2.1.jar:?]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
Caused by: java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused: connect
at org.apache.storm.security.auth.TBackoffConnect.retryNext(TBackoffConnect.java:64) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.security.auth.TBackoffConnect.doConnectWithRetry(TBackoffConnect.java:56) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.security.auth.ThriftClient.reconnect(ThriftClient.java:102) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
... 20 more
Caused by: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused: connect
at org.apache.thrift.transport.TSocket.open(TSocket.java:226) ~[libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TFramedTransport.open(TFramedTransport.java:81) ~[libthrift-0.9.3.jar:0.9.3]
at org.apache.storm.security.auth.SimpleTransportPlugin.connect(SimpleTransportPlugin.java:105) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.security.auth.TBackoffConnect.doConnectWithRetry(TBackoffConnect.java:53) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.security.auth.ThriftClient.reconnect(ThriftClient.java:102) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
... 20 more
Caused by: java.net.ConnectException: Connection refused: connect
at java.net.DualStackPlainSocketImpl.connect0(Native Method) ~[?:1.8.0_112]
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:79) ~[?:1.8.0_112]
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350) ~[?:1.8.0_112]
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) ~[?:1.8.0_112]
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) ~[?:1.8.0_112]
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172) ~[?:1.8.0_112]
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) ~[?:1.8.0_112]
at java.net.Socket.connect(Socket.java:589) ~[?:1.8.0_112]
at org.apache.thrift.transport.TSocket.open(TSocket.java:221) ~[libthrift-0.9.3.jar:0.9.3]
at org.apache.thrift.transport.TFramedTransport.open(TFramedTransport.java:81) ~[libthrift-0.9.3.jar:0.9.3]
at org.apache.storm.security.auth.SimpleTransportPlugin.connect(SimpleTransportPlugin.java:105) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.security.auth.TBackoffConnect.doConnectWithRetry(TBackoffConnect.java:53) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.apache.storm.security.auth.ThriftClient.reconnect(ThriftClient.java:102) ~[storm-client-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
... 20 more
org.apache.storm.utils.NimbusLeaderNotFoundException: Could not find leader nimbus from seed hosts [localhost]. Did you specify a valid list of nimbus hosts for config nimbus.seeds?
at org.apache.storm.utils.NimbusClient.getConfiguredClientAs(NimbusClient.java:141)
at org.apache.storm.utils.NimbusClient.getConfiguredClient(NimbusClient.java:83)
at org.apache.storm.blobstore.NimbusBlobStore.prepare(NimbusBlobStore.java:268)
at org.apache.storm.StormSubmitter.getListOfKeysFromBlobStore(StormSubmitter.java:599)
at org.apache.storm.StormSubmitter.validateConfs(StormSubmitter.java:565)

The README for storm-starter is out of date. The examples don't run locally anymore, because ConfigurableTopology was changed to no longer support local mode; see https://github.com/apache/storm/commit/b254ede46a25466749cd48ebd4bcb56dd791ec4a#diff-de7eab133732a8b5b97be6aa7328e392R92.
If you want to run it locally, you can use https://github.com/apache/storm/blob/master/storm-server/src/main/java/org/apache/storm/LocalCluster.java, which should replace the call to submit in your topology code. Otherwise you'll have to set up a local Storm instance to run the topology, which is very easy; see https://storm.apache.org/releases/2.0.0-SNAPSHOT/Setting-up-a-Storm-cluster.html. The storm-starter README tells you how to submit to an installed cluster.
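As a rough sketch (assuming Storm 2.x, where LocalCluster lives in storm-server and implements AutoCloseable; the topology name and sleep duration are placeholders), replacing the submit call with an in-process cluster could look like:

```java
import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.topology.TopologyBuilder;

public class LocalRunner {
    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();
        // ... set up spouts and bolts as in ExclamationTopology.run ...
        Config conf = new Config();
        conf.setDebug(true);
        // LocalCluster starts an in-process Nimbus/Supervisor, so no external
        // daemons are needed and no Thrift connection to localhost is attempted
        try (LocalCluster cluster = new LocalCluster()) {
            cluster.submitTopology("test", conf, builder.createTopology());
            Thread.sleep(10_000); // let the topology run for a while
        } // closing the cluster shuts everything down
    }
}
```

Note that LocalCluster requires the storm-server dependency on the classpath, which the POM above already declares.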
Edit:
If you want to run it locally, another option would probably be to use the "storm local" command.
PS E:\apache-storm-2.0.0-SNAPSHOT\bin> ./storm help local
Syntax: [storm local topology-jar-path class ...]
Runs the main method of class with the specified arguments but pointing to a local cluster
The storm jars and configs in ~/.storm are put on the classpath.
The process is configured so that StormSubmitter
(http://storm.apache.org/releases/current/javadocs/org/apache/storm/StormSubmitter.html)
and others will interact with a local cluster instead of the one configured by default.
Most options should work just like with the storm jar command.
local also adds in the option --local-ttl which sets the number of seconds the
local cluster will run for before it shuts down.
--java-debug lets you turn on java debugging and set the parameters passed to -agentlib:jdwp on the JDK
--java-debug transport=dt_socket,address=localhost:8000
will open up a debugging server on port 8000.
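Based on the help text above, an invocation for this question's project might look like the following (the jar path, class name, and flag placement are illustrative):

```shell
# package the topology jar first
mvn clean package

# run the main class against an ephemeral in-process cluster
# that shuts itself down after 60 seconds
./storm local target/sample-1.0-SNAPSHOT.jar test.ExclamationTopology --local-ttl 60
```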
The documentation for local mode has been updated in the Storm repo, but hasn't yet made it to the website. See https://github.com/apache/storm/blob/master/docs/Local-mode.md for the new docs.

Related

How to configure embedded MongoDB to run on a given port in Spring Boot

I want to use embedded MongoDB with Spring Boot for my development and testing. When I configure it, the app starts, but it always runs on a dynamic port instead of the port I give in the URI config. The idea is that I want to use the same kind of MongoConfig for both my tests (unit/integration) and the real environment.
My property file looks like this:
spring:
  profiles: local
  data:
    mongodb:
      uri: mongodb://localhost:62309/local_db
@Configuration
public class MongoDBConfig {
    private final Environment env;

    public MongoDBConfig(Environment env) {
        this.env = env;
    }

    @Bean
    public MongoDbFactory getMongoFactory() {
        return new SimpleMongoClientDbFactory(env.getProperty("spring.data.mongodb.uri"));
    }

    @Bean
    public MongoTemplate getMongoTemplate() {
        return new MongoTemplate(getMongoFactory());
    }
}
pom.xml
--------
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.2.1.RELEASE</version>
</parent>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-mongodb</artifactId>
</dependency>
<dependency>
    <groupId>de.flapdoodle.embed</groupId>
    <artifactId>de.flapdoodle.embed.mongo</artifactId>
    <!-- <scope>test</scope> -->
</dependency>
With this configuration the app starts, but on a different dynamic port on every restart. I am also seeing the exception trace below in the logs. From this it seems Spring first tries to connect to the configured Mongo instance on my local machine, fails because there isn't one, and only then starts the embedded MongoDB.
Is there a way I can make Spring use the configured port for the embedded MongoDB, without that error being thrown?
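One possible direction, sketched under the assumption that Spring Boot's embedded-Mongo auto-configuration reads the port from the `spring.data.mongodb.port` property rather than parsing it out of a `uri` (with `uri` set, the driver instead treats it as an external server to connect to, which would explain the connection-refused trace above):

```yaml
spring:
  profiles: local
  data:
    mongodb:
      # host/port/database instead of uri: the embedded auto-configuration
      # can then pin the embedded server to this port (0 or unset means a
      # random free port on every start)
      host: localhost
      port: 62309
      database: local_db
```

This is a hedged suggestion, not a confirmed fix; the exact behavior depends on the Spring Boot and flapdoodle versions in use.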
Error Trace
[cluster-ClusterId{value='62141aecfb7ede3c51b3d064', description='null'}-localhost:62309] INFO org.mongodb.driver.cluster -
Exception in monitor thread while connecting to server localhost:62309
com.mongodb.MongoSocketOpenException: Exception opening socket
at com.mongodb.internal.connection.SocketStream.open(SocketStream.java:70)
at com.mongodb.internal.connection.InternalStreamConnection.open(InternalStreamConnection.java:128)
at com.mongodb.internal.connection.DefaultServerMonitor$ServerMonitorRunnable.run(DefaultServerMonitor.java:117)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.ConnectException: Connection refused: connect
at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:85)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:345)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at com.mongodb.internal.connection.SocketStreamHelper.initialize(SocketStreamHelper.java:64)
at com.mongodb.internal.connection.SocketStream.initializeSocket(SocketStream.java:79)
at com.mongodb.internal.connection.SocketStream.open(SocketStream.java:65)
... 3 common frames omitted

My storm bolt can not deserialize in cluster mode

I used Spring Boot and Storm to do a demo. It works in local mode, but reports an error in cluster mode when I submit a jar:
./storm jar storm-demo3-0.0.1-SNAPSHOT.jar org.springframework.boot.loader.JarLauncher simpleBoot
When I remove Spring Boot and package with the maven-compiler-plugin, it works well:
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>8</source>
                <target>8</target>
            </configuration>
        </plugin>
    </plugins>
</build>
this is the error on supervisor
java.lang.RuntimeException: java.lang.ClassNotFoundException: com.fosung.share.stormdemo3.bolt.FilterBolt
at org.apache.storm.utils.Utils.javaDeserialize(Utils.java:259) ~[storm-core-1.2.2.jar:1.2.2]
at org.apache.storm.utils.Utils.getSetComponentObject(Utils.java:507) ~[storm-core-1.2.2.jar:1.2.2]
at org.apache.storm.daemon.task$get_task_object.invoke(task.clj:76) ~[storm-core-1.2.2.jar:1.2.2]
at org.apache.storm.daemon.task$mk_task_data$fn__6524.invoke(task.clj:180) ~[storm-core-1.2.2.jar:1.2.2]
at org.apache.storm.util$assoc_apply_self.invoke(util.clj:931) ~[storm-core-1.2.2.jar:1.2.2]
at org.apache.storm.daemon.task$mk_task_data.invoke(task.clj:172) ~[storm-core-1.2.2.jar:1.2.2]
at org.apache.storm.daemon.task$mk_task.invoke(task.clj:184) ~[storm-core-1.2.2.jar:1.2.2]
at org.apache.storm.daemon.executor$mk_executor$fn__10662.invoke(executor.clj:379) ~[storm-core-1.2.2.jar:1.2.2]
at clojure.core$map$fn__4553.invoke(core.clj:2622) ~[clojure-1.7.0.jar:?]
at clojure.lang.LazySeq.sval(LazySeq.java:40) ~[clojure-1.7.0.jar:?]
at clojure.lang.LazySeq.seq(LazySeq.java:49) ~[clojure-1.7.0.jar:?]
at clojure.lang.RT.seq(RT.java:507) ~[clojure-1.7.0.jar:?]
at clojure.core$seq__4128.invoke(core.clj:137) ~[clojure-1.7.0.jar:?]
at clojure.core.protocols$seq_reduce.invoke(protocols.clj:30) ~[clojure-1.7.0.jar:?]
at clojure.core.protocols$fn__6506.invoke(protocols.clj:101) ~[clojure-1.7.0.jar:?]
at clojure.core.protocols$fn__6452$G__6447__6465.invoke(protocols.clj:13) ~[clojure-1.7.0.jar:?]
at clojure.core$reduce.invoke(core.clj:6519) ~[clojure-1.7.0.jar:?]
at clojure.core$into.invoke(core.clj:6600) ~[clojure-1.7.0.jar:?]
at org.apache.storm.daemon.executor$mk_executor.invoke(executor.clj:380) ~[storm-core-1.2.2.jar:1.2.2]
at org.apache.storm.daemon.worker$fn__11300$exec_fn__2470__auto__$reify__11302$iter__11307__11311$fn__11312.invoke(worker.clj:663) ~[storm-core-1.2.2.jar:1.2.2]
at clojure.lang.LazySeq.sval(LazySeq.java:40) ~[clojure-1.7.0.jar:?]
at clojure.lang.LazySeq.seq(LazySeq.java:49) ~[clojure-1.7.0.jar:?]
at clojure.lang.RT.seq(RT.java:507) ~[clojure-1.7.0.jar:?]
at clojure.core$seq__4128.invoke(core.clj:137) ~[clojure-1.7.0.jar:?]
at clojure.core$dorun.invoke(core.clj:3009) ~[clojure-1.7.0.jar:?]
at clojure.core$doall.invoke(core.clj:3025) ~[clojure-1.7.0.jar:?]
at org.apache.storm.daemon.worker$fn__11300$exec_fn__2470__auto__$reify__11302.run(worker.clj:663) ~[storm-core-1.2.2.jar:1.2.2]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_152]
at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_152]
at org.apache.storm.daemon.worker$fn__11300$exec_fn__2470__auto____11301.invoke(worker.clj:633) ~[storm-core-1.2.2.jar:1.2.2]
at clojure.lang.AFn.applyToHelper(AFn.java:178) ~[clojure-1.7.0.jar:?]
at clojure.lang.AFn.applyTo(AFn.java:144) ~[clojure-1.7.0.jar:?]
at clojure.core$apply.invoke(core.clj:630) ~[clojure-1.7.0.jar:?]
at org.apache.storm.daemon.worker$fn__11300$mk_worker__11391.doInvoke(worker.clj:605) [storm-core-1.2.2.jar:1.2.2]
at clojure.lang.RestFn.invoke(RestFn.java:512) [clojure-1.7.0.jar:?]
at org.apache.storm.daemon.worker$_main.invoke(worker.clj:798) [storm-core-1.2.2.jar:1.2.2]
at clojure.lang.AFn.applyToHelper(AFn.java:165) [clojure-1.7.0.jar:?]
at clojure.lang.AFn.applyTo(AFn.java:144) [clojure-1.7.0.jar:?]
at org.apache.storm.daemon.worker.main(Unknown Source) [storm-core-1.2.2.jar:1.2.2]
Caused by: java.lang.ClassNotFoundException: com.fosung.share.stormdemo3.bolt.FilterBolt
at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_152]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_152]
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338) ~[?:1.8.0_152]
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_152]
at java.lang.Class.forName0(Native Method) ~[?:1.8.0_152]
at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_152]
at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:683) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1863) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1746) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2037) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568) ~[?:1.8.0_152]
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428) ~[?:1.8.0_152]
at org.apache.storm.utils.Utils.javaDeserialize(Utils.java:253) ~[storm-core-1.2.2.jar:1.2.2]
... 38 more
2019-05-22 11:09:14.684 o.a.s.util main [ERROR] Halting process: ("Error on initialization")
java.lang.RuntimeException: ("Error on initialization")
at org.apache.storm.util$exit_process_BANG_.doInvoke(util.clj:341) [storm-core-1.2.2.jar:1.2.2]
at clojure.lang.RestFn.invoke(RestFn.java:423) [clojure-1.7.0.jar:?]
at org.apache.storm.daemon.worker$fn__11300$mk_worker__11391.doInvoke(worker.clj:605) [storm-core-1.2.2.jar:1.2.2]
at clojure.lang.RestFn.invoke(RestFn.java:512) [clojure-1.7.0.jar:?]
at org.apache.storm.daemon.worker$_main.invoke(worker.clj:798) [storm-core-1.2.2.jar:1.2.2]
at clojure.lang.AFn.applyToHelper(AFn.java:165) [clojure-1.7.0.jar:?]
at clojure.lang.AFn.applyTo(AFn.java:144) [clojure-1.7.0.jar:?]
at org.apache.storm.daemon.worker.main(Unknown Source) [storm-core-1.2.2.jar:1.2.2]
my pom.xml
<properties>
    <java.version>1.8</java.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.storm</groupId>
        <artifactId>storm-core</artifactId>
        <version>1.2.2</version>
        <!--<scope>provided</scope>-->
        <exclusions>
            <exclusion>
                <groupId>org.apache.logging.log4j</groupId>
                <artifactId>log4j-slf4j-impl</artifactId>
            </exclusion>
            <exclusion>
                <groupId>org.apache.logging.log4j</groupId>
                <artifactId>log4j-1.2-api</artifactId>
            </exclusion>
            <exclusion>
                <groupId>org.apache.logging.log4j</groupId>
                <artifactId>log4j-web</artifactId>
            </exclusion>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
            <exclusion>
                <artifactId>ring-cors</artifactId>
                <groupId>ring-cors</groupId>
            </exclusion>
        </exclusions>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>8</source>
                <target>8</target>
            </configuration>
        </plugin>
    </plugins>
</build>
MyTopology
public class MyTopology {
    public static void main(String[] args) {
        System.out.println("MyTopology main start");
        // define a topology
        TopologyBuilder builder = new TopologyBuilder();
        // set 1 executor (thread); one is the default
        DataSpout dataSpout = new DataSpout();
        builder.setSpout("spoutId", dataSpout);
        // shuffleGrouping means random grouping
        // set 1 executor (thread) and two tasks
        FilterBolt filterBolt = new FilterBolt();
        InsertBolt insertBolt = new InsertBolt();
        builder.setBolt("filterBolt", filterBolt).setNumTasks(1).allGrouping("spoutId", "spoutId");
        builder.setBolt("insertBolt", insertBolt).setNumTasks(1).allGrouping("filterBolt", "spoutId");
        Config conf = new Config();
        try {
            // with arguments, submit the job to the cluster, using the first argument as the topology name
            // without arguments, submit locally
            if (args != null && args.length > 0) {
                System.out.println("running in remote mode");
                StormSubmitter.submitTopology(args[0], conf, builder.createTopology());
            } else {
                // start in local mode
                System.out.println("running in local mode");
                LocalCluster cluster = new LocalCluster();
                cluster.submitTopology("TopologyApp", conf, builder.createTopology());
            }
        } catch (Exception e) {
            System.out.println("Storm failed to start! Exiting!");
            e.printStackTrace(); // print before exiting, otherwise this line is never reached
            System.exit(1);
        }
        // System.out.println("Storm started successfully...");
    }
}
My spout
public class DataSpout extends BaseRichSpout {
    SpoutOutputCollector collector;

    @Override
    public void open(Map conf, TopologyContext context, SpoutOutputCollector collector) {
        this.collector = collector;
        System.out.println("spout open");
    }

    @Override
    public void nextTuple() {
        /*try {
            Thread.sleep(1000);
            return;
        } catch (InterruptedException e) {
            e.printStackTrace();
        }*/
        System.out.println("spout nextTuple start");
        int random = (int) (Math.random() * 1000); // cast after multiplying, otherwise the result is always 0
        collector.emit("spoutId", new Values(random));
        try {
            TimeUnit.SECONDS.sleep(3);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declareStream("spoutId", new Fields("spoutId"));
    }
}
My bolt
public class FilterBolt extends BaseRichBolt {
    OutputCollector collector;

    @Override
    public void prepare(Map stormConf, TopologyContext context, OutputCollector collector) {
        this.collector = collector;
    }

    @Override
    public void execute(Tuple input) {
        System.out.println("filter bolt start");
        Integer o = (Integer) input.getValues().get(0);
        if (o > 10) {
            collector.emit("spoutId", new Values(o));
        }
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // declare the streamId the next bolt will receive on
        declarer.declareStream("spoutId", new Fields("spoutId"));
    }
}
Spring (boot) doesn't fit nicely with Storm. Storm is a framework, meaning it is responsible for managing the lifecycle of some classes like your bolt. As Storm doesn't know anything about Spring, Spring's dependency injection doesn't work out of the box. It is possible to set up Spring to work on parts of a Storm application with e.g. task and worker hooks, which can allow you to create a Spring context in a Storm worker. I don't think I would recommend it unless you have a good reason to need Spring.
Regarding the error you're getting, Storm is failing to find one of your classes in the jar you're submitting. Since you didn't post your pom.xml for your Spring configuration, it's hard to tell, but maybe you're using a plugin that moves your classes around. When you submit a topology to Storm, Storm runs a couple of phases you should understand:
First you run storm jar your-topology.jar com.yourcompany.yourMain. This starts a JVM on your local machine (or wherever you run the command), which runs your topology setup, in your case MyTopology.main. The setup then serializes your spouts and bolts, and sends the jar and serialized topology to Nimbus (a separate JVM), which in turn sends it to the supervisors (yet more separate JVMs). On the supervisors, the supervisor JVM boots up a number of worker JVMs to run your topology. Each worker JVM starts with a command like java -cp your-topology.jar org.apache.storm.Worker. The worker JVMs load the serialized topology and the classes in your topology jar, and boot up threads to run your spouts and bolts.
These phases are most likely the reason it's failing for you. When you run the topology setup code, you're doing it with a Spring Boot command, so Spring Boot gets a chance to run. When the topology starts up on the worker machines, the JVMs are started with a regular old call to a non-Spring main method, so Spring doesn't get a chance to run.
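If the problem is Spring Boot's repackaging (boot repackage moves your classes under BOOT-INF/classes, where the worker JVM's plain URLClassLoader cannot find them, matching the ClassNotFoundException above), a conventional Storm packaging is a flat fat jar built with the maven-shade-plugin. A minimal sketch follows; the plugin version is illustrative and the mainClass is guessed from the package in the stack trace:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- keeps classes at their original package paths, unlike
                         Spring Boot's BOOT-INF layout, so the worker JVM's
                         classloader can deserialize your bolts -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.fosung.share.stormdemo3.MyTopology</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
```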
If you decide not to use Spring, you can find a working example POM here.
Other links that may be of interest are an earlier answer and a project doing Spring integration for Storm.

KIE Workbench Integration in spring throw exception

What I want to do is initialize the KieSession in Spring when Tomcat starts. The KieSession will then change dynamically when the rules are changed and redeployed as a jar in the kie-workbench.
I downloaded kie-drools-wb-6.3.0.Final-tomcat7.war and deployed it on a Linux machine. I can use the following code to test:
public class TestScanMaven {
    public static void main(String[] args) {
        KieServices kieServices = KieServices.Factory.get();
        try {
            ReleaseId releaseId = kieServices.newReleaseId("com.test", "epay-risk2", "LATEST");
            KieContainer kieContainer = kieServices.newKieContainer(releaseId);
            KieScanner kieScanner = kieServices.newKieScanner(kieContainer);
            kieScanner.start(20000L);
            Scanner scanner = new Scanner(System.in);
            while (true) {
                runRule(kieContainer);
                System.out.println("Press enter in order to run the test again....");
                scanner.nextLine();
            }
        } catch (Exception e) {
            System.out.println("Exception thrown while constructing InputStream");
            System.out.println(e.getMessage());
        }
    }

    private static void runRule(KieContainer kieContainer) {
        KieSession kSession = kieContainer.newKieSession("defaultKieSession");
        // go !
        Message message = new Message();
        message.setMessage("Hello World");
        message.setStatus(Message.HELLO);
        kSession.insert(message);
        kSession.fireAllRules();
    }
}
This code works quite well. But when I change to the following code in Spring, it throws an exception:
@PostConstruct
public void init() {
    KieServices kieServices = KieServices.Factory.get();
    ReleaseId releaseId = kieServices.newReleaseId("com.test", "epay-risk2", "LATEST");
    KieContainer kieContainer = kieServices.newKieContainer(releaseId);
    kSession = kieContainer.newKieSession("defaultKieSession");
    kSession.setGlobal("riskRecordService", riskRecordService);
    KieScanner kieScanner = kieServices.newKieScanner(kieContainer);
    kieScanner.start(20000L);
}
the Exception thrown:
Caused by: java.lang.NoSuchMethodError: com.google.inject.Binder.bindListener(Lcom/google/inject/matcher/Matcher;[Lcom/google/inject/spi/ProvisionListener;)V
at org.eclipse.sisu.plexus.PlexusBindingModule.configure(PlexusBindingModule.java:75)
at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223)
at com.google.inject.spi.Elements.getElements(Elements.java:101)
at com.google.inject.spi.Elements.getElements(Elements.java:92)
at org.eclipse.sisu.wire.WireModule.configure(WireModule.java:75)
at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223)
at com.google.inject.spi.Elements.getElements(Elements.java:101)
at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:133)
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:103)
at com.google.inject.Guice.createInjector(Guice.java:95)
at com.google.inject.Guice.createInjector(Guice.java:72)
at com.google.inject.Guice.createInjector(Guice.java:62)
at org.codehaus.plexus.DefaultPlexusContainer.addPlexusInjector(DefaultPlexusContainer.java:477)
at org.codehaus.plexus.DefaultPlexusContainer.<init>(DefaultPlexusContainer.java:203)
at org.codehaus.plexus.DefaultPlexusContainer.<init>(DefaultPlexusContainer.java:167)
at org.kie.scanner.embedder.MavenEmbedderUtils.buildPlexusContainer(MavenEmbedderUtils.java:166)
at org.kie.scanner.embedder.MavenEmbedderUtils.buildPlexusContainer(MavenEmbedderUtils.java:140)
at org.kie.scanner.embedder.PlexusComponentProvider.<init>(PlexusComponentProvider.java:37)
at org.kie.scanner.embedder.MavenEmbedderUtils.buildComponentProvider(MavenEmbedderUtils.java:56)
at org.kie.scanner.embedder.MavenEmbedder.<init>(MavenEmbedder.java:75)
at org.kie.scanner.embedder.MavenEmbedder.<init>(MavenEmbedder.java:69)
at org.kie.scanner.embedder.MavenProjectLoader.parseMavenPom(MavenProjectLoader.java:55)
at org.kie.scanner.embedder.MavenProjectLoader.parseMavenPom(MavenProjectLoader.java:49)
at org.kie.scanner.MavenPomModelGenerator.parse(MavenPomModelGenerator.java:36)
at org.drools.compiler.kproject.xml.PomModel$Parser.parse(PomModel.java:89)
at org.drools.compiler.kie.builder.impl.AbstractKieModule.getPomModel(AbstractKieModule.java:395)
at org.drools.compiler.kie.builder.impl.AbstractKieModule.getJarDependencies(AbstractKieModule.java:126)
at org.kie.scanner.MavenClassLoaderResolver.getClassLoader(MavenClassLoaderResolver.java:64)
at org.drools.compiler.kie.builder.impl.KieModuleKieProject.<init>(KieModuleKieProject.java:68)
at org.drools.compiler.kie.builder.impl.KieModuleKieProject.<init>(KieModuleKieProject.java:56)
at org.drools.compiler.kie.builder.impl.KieBuilderImpl.buildKieModule(KieBuilderImpl.java:221)
at org.kie.scanner.KieRepositoryScannerImpl.build(KieRepositoryScannerImpl.java:220)
at org.kie.scanner.KieRepositoryScannerImpl.buildArtifact(KieRepositoryScannerImpl.java:170)
at org.kie.scanner.KieRepositoryScannerImpl.loadArtifact(KieRepositoryScannerImpl.java:126)
at org.kie.scanner.KieRepositoryScannerImpl.loadArtifact(KieRepositoryScannerImpl.java:121)
at org.drools.compiler.kie.builder.impl.KieRepositoryImpl.loadKieModuleFromMavenRepo(KieRepositoryImpl.java:129)
at org.drools.compiler.kie.builder.impl.KieRepositoryImpl.getKieModule(KieRepositoryImpl.java:115)
at org.drools.compiler.kie.builder.impl.KieRepositoryImpl.getKieModule(KieRepositoryImpl.java:92)
at org.drools.compiler.kie.builder.impl.KieServicesImpl.newKieContainer(KieServicesImpl.java:115)
at org.drools.compiler.kie.builder.impl.KieServicesImpl.newKieContainer(KieServicesImpl.java:111)
at com.test.risk.service.risk.epay.rule.impl.RiskKieServiceImpl.init(RiskKieServiceImpl.java:52)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleElement.invoke(InitDestroyAnnotationBeanPostProcessor.java:344)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor$LifecycleMetadata.invokeInitMethods(InitDestroyAnnotationBeanPostProcessor.java:295)
at org.springframework.beans.factory.annotation.InitDestroyAnnotationBeanPostProcessor.postProcessBeforeInitialization(InitDestroyAnnotationBeanPostProcessor.java:130)
The settings.xml in my Maven home directory is:
<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0 http://maven.apache.org/xsd/settings-1.0.0.xsd">
<servers>
<server>
<id>guvnor-m2-repo</id>
<username>drools_tomcat</username>
<password>XXXXXX</password>
<configuration>
<wagonProvider>httpclient</wagonProvider>
<httpConfiguration>
<all>
<usePreemptive>true</usePreemptive>
</all>
</httpConfiguration>
</configuration>
</server>
</servers>
<profiles>
<profile>
<id>guvnor-m2-repo</id>
<repositories>
<repository>
<id>guvnor-m2-repo</id>
<name>BRMS Repository</name>
<url>http://xxx.xxx.xxx.xxx:8082/guvnor/maven2/</url>
<layout>default</layout>
<releases>
<enabled>true</enabled>
<updatePolicy>always</updatePolicy>
</releases>
<snapshots>
<enabled>true</enabled>
<updatePolicy>always</updatePolicy>
</snapshots>
</repository>
</repositories>
</profile>
</profiles>
<activeProfiles>
<activeProfile>guvnor-m2-repo</activeProfile>
</activeProfiles>
</settings>

gwt-test-utils unit tests fail when run with JaCoCo

We are attempting to generate coverage reports for our GWT application from a set of unit tests written using gwt-test-utils. The project is a multi-module Maven project. We are using the Sonar plugin on Jenkins to generate and collate our coverage and violation information.
When the build jobs run, all the GWT unit tests pass as part of the normal build; however, when the Sonar plugin attempts to rerun the tests, they all fail with the following error:
initializationError(uk.co.card.gwt.retailpost.client.dialog.productmodify.CurrencyEditDialogTest) Time elapsed: 0 sec <<< ERROR!
com.googlecode.gwt.test.exceptions.GwtTestException: Error while generating gwt-test-utils prerequisites
at com.googlecode.gwt.test.internal.GwtFactory.(GwtFactory.java:113)
at com.googlecode.gwt.test.internal.GwtFactory.initializeIfNeeded(GwtFactory.java:45)
at com.googlecode.gwt.test.internal.junit.AbstractGwtRunner.(AbstractGwtRunner.java:30)
at com.googlecode.gwt.test.GwtRunner.(GwtRunner.java:19)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at org.junit.internal.builders.AnnotatedBuilder.buildRunner(AnnotatedBuilder.java:29)
at org.junit.internal.builders.AnnotatedBuilder.runnerForClass(AnnotatedBuilder.java:21)
at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
at org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
at org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:250)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)
Caused by: com.google.gwt.core.ext.UnableToCompleteException: (see previous log entries)
at com.google.gwt.dev.cfg.ModuleDef.checkForSeedTypes(ModuleDef.java:559)
at com.google.gwt.dev.cfg.ModuleDef.getCompilationState(ModuleDef.java:363)
at com.google.gwt.dev.cfg.ModuleDef.getCompilationState(ModuleDef.java:354)
at com.googlecode.gwt.test.internal.GwtFactory.createCompilationState(GwtFactory.java:151)
at com.googlecode.gwt.test.internal.GwtFactory.(GwtFactory.java:106)
... 25 more
Looking through the rest of the console output from Jenkins, and the workspace directories, I can't locate the log file that the "com.google.gwt.core.ext.UnableToCompleteException: (see previous log entries)" refers to.
Has anyone encountered a similar problem and knows how to get Sonar to run the gwt-test-utils tests successfully, or at least has an idea where to look for the previous log entries mentioned in the exception?
EDIT: After further experimentation, the issue appears to be caused by JaCoCo. Running just the unit tests instrumented with JaCoCo (and with no Sonar involved) results in the same error.
EDIT 2: Sample from the pom.xml:
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>0.6.2.201302030002</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12.4</version>
<configuration>
<excludedGroups combine.self="override" />
<reuseForks>true</reuseForks>
<argLine>-Xmx1024m -XX:MaxPermSize=256m ${jacoco.agent.argLine}</argLine>
</configuration>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<configuration>
<propertyName>jacoco.agent.argLine</propertyName>
<destFile>${sonar.jacoco.itReportPath}</destFile>
<append>true</append>
<excludes>
<exclude>*.osgi.*</exclude>
<exclude>*.apache.*</exclude>
<exclude>*.sourceforge.*</exclude>
<exclude>*.junit.*</exclude>
<!-- Test support code does not need to be covered -->
<exclude>uk.co.card.retailpost.clientintegration.utilities.*</exclude>
</excludes>
<classDumpDir>temp/classes</classDumpDir>
</configuration>
<executions>
<execution>
<id>agent</id>
<goals>
<goal>prepare-agent</goal>
</goals>
</execution>
<execution>
<id>report</id>
<goals>
<goal>report</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
As I mentioned in the comments, libraries are loaded in a different order when JaCoCo runs with the maven-surefire-plugin. To solve this problem, write your own runner (extending com.googlecode.gwt.test.GwtRunner) that replaces the thread's context classloader:
import com.googlecode.gwt.test.GwtRunner;
import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;
public class MyGwtRunner extends GwtRunner {
static {
URLClassLoader classLoader = (URLClassLoader) MyGwtRunner.class.getClassLoader();
try {
URL[] urls = getClassPath();
ClassLoader cl = URLClassLoader.newInstance(urls, classLoader);
Thread.currentThread().setContextClassLoader(cl);
} catch (MalformedURLException e) {
throw new IllegalStateException(e);
}
}
public MyGwtRunner(Class<?> clazz) throws Throwable {
super(clazz);
}
private static URL[] getClassPath() throws MalformedURLException {
String classPath = System.getProperty("java.class.path");
String pathSeparator = System.getProperty("path.separator");
String[] array = classPath.split(pathSeparator);
List<URL> files = new ArrayList<URL>();
for (String a : array) {
files.add(new File(a).toURI().toURL());
}
return files.toArray(new URL[files.size()]);
}
}
In your tests, replace GwtRunner with MyGwtRunner:
@GwtModule("com.my.module.GwtTestUtils")
@RunWith(MyGwtRunner.class)
public abstract class AbstractGwtJunit extends GwtTest {
....
}
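The core of the fix above is the context-classloader swap in the static block. A minimal standalone sketch of that technique (the class name ContextClassLoaderDemo is mine, not from the answer): it rebuilds the classpath in java.class.path order, installs it as the thread's context classloader, and confirms the swap took effect.

```java
import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;

public class ContextClassLoaderDemo {
    // Turn the java.class.path entries into URLs, preserving their order.
    static URL[] classPathUrls() throws MalformedURLException {
        List<URL> urls = new ArrayList<URL>();
        String[] entries = System.getProperty("java.class.path")
                .split(System.getProperty("path.separator"));
        for (String entry : entries) {
            urls.add(new File(entry).toURI().toURL());
        }
        return urls.toArray(new URL[0]);
    }

    public static void main(String[] args) throws Exception {
        ClassLoader original = Thread.currentThread().getContextClassLoader();
        // Fresh loader that resolves classes in java.class.path order,
        // delegating to the original loader as parent.
        URLClassLoader ordered = URLClassLoader.newInstance(classPathUrls(), original);
        Thread.currentThread().setContextClassLoader(ordered);
        // Any code that resolves resources via the context classloader
        // (as gwt-test-utils does) now sees the re-ordered classpath.
        System.out.println(Thread.currentThread().getContextClassLoader() == ordered);
    }
}
```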

has MetaData yet the class cant be found

Employee.java
@Entity
@Table(name="EMPLOYEE",
uniqueConstraints = { @UniqueConstraint(columnNames={"EMPNO"}) })
public class Employee implements Serializable {
@Id
@Column(name="EMPNO")
private String empNo;
@Column(name="FIRSTNAME")
private String firstName;
@Column(name="midInit")
private char midInit;
@Column(name="LASTNAME")
private String lastName;
@OneToMany(fetch=FetchType.LAZY, mappedBy = "id.employee")
Set<EmployeeSchedule> schedules;
.....
ScheduleSlot.java
@Entity
@Table(name="SCHEDULE_SLOT", uniqueConstraints = { @UniqueConstraint
(columnNames= {"SLOTNO"})})
public class ScheduleSlot implements Serializable {
@Id @Column(name="SLOTNO")
private String slotNo;
@Column(name="SLOTSTART")
private Date slotStart;
@Column(name="SLOTEND")
private Date slotEnd;
@OneToMany(fetch=FetchType.LAZY, mappedBy="id.schedule")
private Set<Employee> employees; // = new HashSet<Employee>();
....
EmployeeSchedule.java
@Entity
@Table(name="EMPLOYEE_SCHEDULE")
@AssociationOverrides(
{ @AssociationOverride
( name = "id.employee",
joinColumns = @JoinColumn(name = "empNo")),
@AssociationOverride
( name = "id.schedule",
joinColumns = @JoinColumn(name = "slotNo"))
})
public class EmployeeSchedule {
@EmbeddedId
EmployeeSchedulePK id;
@Column(name="available")
private boolean available;
@Column(name="UPDATEDON")
private Date updatedOn;
@Column(name="UPDATEDBY")
@ManyToOne(fetch=FetchType.LAZY)
@JoinColumn(name="updatedBy")
private String updatedBy;
public EmployeeSchedule() {}
public EmployeeSchedule(EmployeeSchedulePK id) {
this.id = id;
}
......
EmployeeSchedulePK.java
@Embeddable
public class EmployeeSchedulePK implements Serializable {
@ManyToOne
private Employee employee;
@ManyToOne
private ScheduleSlot schedule;
public EmployeeSchedulePK() {
}
...
When running mvn install I get the following error. Please advise, thanks.
DataNucleus Enhancer (version 3.1.0.release) : Enhancement of classes
Class "com.co.dsp.iwork.entity.EmployeeSchedule" has MetaData yet the class cant be found. Please check your CLASSPATH specifications.
DataNucleus Enhancer completed with an error. Please review the enhancer log for full details. Some classes may have been enhanced but some caused errors
The DataNucleus log shows:
Class "com.co.dsp.iwork.entity.EmployeeSchedule" : Populating Meta-Data
13:22:59,059 (main) DEBUG [DataNucleus.MetaData] - Class "com.co.dsp.iwork.entity.EmployeeSchedule" field "employee" : adding Meta-Data for field embedded in class "com.co.dsp.iwork.entity.EmployeeSchedulePK" since it didnt appear in the Meta-Data definition.
13:22:59,059 (main) DEBUG [DataNucleus.MetaData] - Class "com.co.dsp.iwork.entity.EmployeeSchedule" field "schedule" : adding Meta-Data for field embedded in class "com.co.dsp.iwork.entity.EmployeeSchedulePK" since it didnt appear in the Meta-Data definition.
13:22:59,060 (main) ERROR [DataNucleus.MetaData] - *Class "com.co.dsp.iwork.entity.#UNKNOWN.id" has MetaData yet the class cant be found. Please check your CLASSPATH specifications.*
13:22:59,061 (main) DEBUG [DataNucleus.MetaData] - org.datanucleus.metadata.InvalidClassMetaDataException: Class "com.co.dsp.iwork.entity.EmployeeSchedule" has MetaData yet the class cant be found. Please check your CLASSPATH specifications.
13:22:59,063 (main) ERROR [DataNucleus.Enhancer] - DataNucleus Enhancer completed with an error. Please review the enhancer log for full details. Some classes may have been enhanced but some caused errors
Class "com.co.dsp.iwork.entity.EmployeeSchedule" has MetaData yet the class cant be found. Please check your CLASSPATH specifications.
org.datanucleus.metadata.InvalidClassMetaDataException: Class "com.co.dsp.iwork.entity.EmployeeSchedule" has MetaData yet the class cant be found. Please check your CLASSPATH specifications.
at org.datanucleus.metadata.ClassMetaData.populateMemberMetaData(ClassMetaData.java:464)
at org.datanucleus.metadata.ClassMetaData.populate(ClassMetaData.java:210)
at org.datanucleus.metadata.MetaDataManager$1.run(MetaDataManager.java:2699)
at java.security.AccessController.doPrivileged(Native Method)
at org.datanucleus.metadata.MetaDataManager.populateAbstractClassMetaData(MetaDataManager.java:2693)
at org.datanucleus.metadata.MetaDataManager.populateFileMetaData(MetaDataManager.java:2516)
at org.datanucleus.metadata.MetaDataManager.initialiseFileMetaDataForUse(MetaDataManager.java:1123)
at org.datanucleus.metadata.MetaDataManager.loadPersistenceUnit(MetaDataManager.java:986)
at org.datanucleus.enhancer.DataNucleusEnhancer.getFileMetadataForInput(DataNucleusEnhancer.java:793)
at org.datanucleus.enhancer.DataNucleusEnhancer.enhance(DataNucleusEnhancer.java:525)
at org.datanucleus.enhancer.DataNucleusEnhancer.main(DataNucleusEnhancer.java:1258)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.datanucleus.maven.AbstractDataNucleusMojo.executeInJvm(AbstractDataNucleusMojo.java:333)
at org.datanucleus.maven.AbstractEnhancerMojo.enhance(AbstractEnhancerMojo.java:249)
at org.datanucleus.maven.AbstractEnhancerMojo.executeDataNucleusTool(AbstractEnhancerMojo.java:72)
at org.datanucleus.maven.AbstractDataNucleusMojo.execute(AbstractDataNucleusMojo.java:126)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
POM.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.co.dsp</groupId>
<artifactId>dsp.dsp-iwork</artifactId>
<version>0.6.0-SNAPSHOT</version>
<name>DSP-iWork</name>
<packaging>jar</packaging>
<properties>
<maven.test.skip>true</maven.test.skip>
<dsp.version>0.6.0-SNAPSHOT</dsp.version>
<equinox.ver>3.7.0.v20110613</equinox.ver>
<spring.maven.artifact.version>3.0.5.RELEASE</spring.maven.artifact.version>
<slf4j.version>1.6.1</slf4j.version>
<spring.osgi.version>1.2.1</spring.osgi.version>
</properties>
<dependencies>
<dependency><groupId>com.co.dsp</groupId>
<artifactId>dsp.kernel</artifactId>
<version>${dsp.version}</version>
</dependency>
<dependency><groupId>org.datanucleus</groupId>
<artifactId>datanucleus-core</artifactId>
<version>3.1.1</version>
<scope>runtime</scope>
</dependency><dependency><groupId>com.co.dsp</groupId>
<artifactId>dsp.dsi.das.dbconnection</artifactId>
<version>${dsp.version}</version>
</dependency><dependency>
<groupId>com.co.dsp</groupId>
<artifactId>dsp.dsi.das.core</artifactId>
<version>${dsp.version}</version>
</dependency>
<dependency><groupId>com.co.dsp</groupId>
<artifactId>dsp.dsi.das.api</artifactId>
<version>${dsp.version}</version>
</dependency>
<dependency><groupId>com.co.dsp</groupId>
<artifactId>dsp.dsi.dups.core</artifactId>
<version>${dsp.version}</version>
</dependency><dependency>
<groupId>com.co.dsp</groupId>
<artifactId>dsp.dsi.dups.api</artifactId>
<version>${dsp.version}</version>
</dependency><dependency>
<groupId>com.co.dsp</groupId>
<artifactId>dsp.dsi.scheduler</artifactId>
<version>${dsp.version}</version>
</dependency><dependency>
<groupId>com.co.dsp</groupId>
<artifactId>das-maven-plugin</artifactId>
<version>${dsp.version}</version>
</dependency>
</dependencies>
<repositories>
<repository><id>nexus</id>
<url>http://sjc1ssadsp01.crd.co.com:8081/nexus/content/groups/public</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>nexus</id>
<url>http://sjc1ssadsp01.crd.co.com:8081/nexus/content/groups/public</url>
</pluginRepository>
<pluginRepository>
<id>maven-repo</id>
<name>maven repo</name>
<url>http://repo.maven.apache.org/maven2/</url>
</pluginRepository>
<pluginRepository>
<id>com.springsource.repository.bundles.milestone</id>
<name> SpringSource Enterprise Bundle Repository - SpringSource Milestone
Releases</name>
<url>http://repository.springsource.com/maven/bundles/milestone</url>
</pluginRepository>
</pluginRepositories>
<build><plugins><plugin>
<groupId>org.datanucleus</groupId>
<artifactId>maven-datanucleus-plugin</artifactId>
<version>3.0.1</version>
<configuration>
<fork>false</fork>
<log4jConfiguration>${basedir}/log4j.properties</log4jConfiguration>
<verbose>true</verbose>
<enhancerName>ASM</enhancerName>
<persistenceUnitName>WorkPersistenceUnit</persistenceUnitName>
</configuration><executions><execution>
<phase>compile</phase><goals><goal>enhance</goal></goals></execution></executions>
</plugin><plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>2.3.7</version>
<extensions>false</extensions><executions> <execution>
<id>bundle</id> <phase>package</phase><goals>
<goal>bundle</goal></goals></execution></executions>
<configuration>
<instructions>
<Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
<Bundle-Name>DSP iWork Integration</Bundle-Name>
<Bundle-Version>${dsp.version}</Bundle-Version>
<Bundle-Classpath>.</Bundle-Classpath>
<Import-Package>javax.persistence,com.co.dsp.dsi.scheduler,org.apache.log4j,
com.co.dsp.kernel.spi;version="0.3",com.co.dsp.kernel.util;version="0.3",
com.co.dsp.dsi.dups.api;version="0.3",com.co.dsp.dsi.dups.constants;version="0.3",
com.co.dsp.dsi.dups.exception,com.co.dsp.dsi.config,org.springframework.beans.factory,
org.springframework.osgi.context,org.springframework.osgi.util,
org.osgi.framework;version="1.5",com.co.dsp.dsi.das.api,javax.jdo,
javax.jdo.identity, javax.jdo.spi</Import-Package></instructions></configuration>
</plugin></plugins>
</build>
</project>
UPDATE: partial log output after executing mvn clean process-classes -X:
[DEBUG]Writing resolution tracking file C:\Users\502128830\.m2
\repository\asm\asm\resolver-status.properties
[DEBUG]Could not find metadata asm:asm/maven-metadata.xml in
com.springsource.repository.bundles.milestone
(http://repository.springsource.com/maven/bundles/milestone)
[DEBUG]Could not find metadata asm:asm/maven-metadata.xml in DN_M2_Repo
(http://www.datanucleus.org/downloads/maven2/)
[WARNING] Could not transfer metadata asm:asm/maven-metadata.xml from/to
local.repository (file:../../local.repository/trunk): No connector available to access
repository local.repository (file:../../local.repository/trunk) of type legacy
using the available factories WagonRepositoryConnectorFactory
org.sonatype.aether.transfer.MetadataTransferException: Could not transfer metadata
asm:asm/maven-metadata.xml from/to local.repository
(file:../../local.repository/trunk): No connector available to access repository
local.repository (
file:../../local.repository/trunk) of type legacy using the available factories
WagonRepositoryConnectorFactory
at org.sonatype.aether.impl.internal.DefaultMetadataResolver$ResolveTask.run
(DefaultMetadataResolver.java:588)
at org.sonatype.aether.util.concurrency.RunnableErrorForwarder$1.run
(RunnableErrorForwarder.java:60)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask
(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run
(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: org.sonatype.aether.transfer.NoRepositoryConnectorException: No connector
available to access repository local.repository (file:../../local.repository/trunk)
of type legacy using the available factories WagonRepositoryConnectorFactory
at
org.sonatype.aether.impl.internal.DefaultRemoteRepositoryManager.getRepositoryConnector
(DefaultRemoteRepositoryManager.java:400)
at org.sonatype.aether.impl.internal.DefaultMetadataResolver$ResolveTask.run
(DefaultMetadataResolver.java:559)
... 4 more
[DEBUG]Could not find metadata asm:asm/maven-metadata.xml in local (C:\Users\502128830
\.m2\repository)
[INFO] --- maven-datanucleus-plugin:3.0.1:enhance (default) @ dsp.dsp-iwork ---
[DEBUG]Could not find metadata org.datanucleus:datanucleus-core/maven-metadata.xml in
local (C:\Users\502128830\.m2\repository)
[WARNING] Could not transfer metadata asm:asm/maven-metadata.xml from/to
local.repository (file:../../local.repository/trunk): No connector available to access
repository local.repository (file:../../local.repository/trunk) of type legacy
using the available factories WagonRepositoryConnectorFactory
org.sonatype.aether.transfer.MetadataTransferException: Could not transfer metadata
asm:asm/maven-metadata.xml from/to local.repository
(file:../../local.repository/trunk): No connector available to access repository
local.repository (
file:../../local.repository/trunk) of type legacy using the available factories
WagonRepositoryConnectorFactory
at org.sonatype.aether.impl.internal.DefaultMetadataResolver$ResolveTask.run
(DefaultMetadataResolver.java:588)
at org.sonatype.aether.util.concurrency.RunnableErrorForwarder$1.run
(RunnableErrorForwarder.java:60)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run
(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: org.sonatype.aether.transfer.NoRepositoryConnectorException: No connector
available to access repository local.repository (file:../../local.repository/trunk)
of type legacy using the available factories WagonRepositoryConnectorFactory
at
org.sonatype.aether.impl.internal.DefaultRemoteRepositoryManager.getRepositoryConnector
(DefaultRemoteRepositoryManager.java:400)
at org.sonatype.aether.impl.internal.DefaultMetadataResolver$ResolveTask.run
(DefaultMetadataResolver.java:559)
... 4 more
Could you post your pom?
You want to work on bytecode with the DataNucleus Enhancer.
Your plugin is bound to the compile phase, which is wrong, since the enhancer works on compiled classes.
You should bind it to the process-classes phase instead.
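A minimal sketch of that binding, reusing the plugin version, goal, and persistence unit name from the POM above (the rest of the configuration would stay as posted):

```xml
<plugin>
  <groupId>org.datanucleus</groupId>
  <artifactId>maven-datanucleus-plugin</artifactId>
  <version>3.0.1</version>
  <configuration>
    <persistenceUnitName>WorkPersistenceUnit</persistenceUnitName>
  </configuration>
  <executions>
    <execution>
      <!-- process-classes runs after compile, so the enhancer
           sees the freshly compiled .class files -->
      <phase>process-classes</phase>
      <goals>
        <goal>enhance</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this binding, mvn clean process-classes (or any later phase such as install) compiles first and then enhances.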

Resources