Why can't I run the example from storm-starter using this command? - maven

I have had no experience with Storm or Maven before, and I'm working on my starter project. When I compile the starter project from the GitHub repository using the command given there, i.e. this:
mvn compile exec:java -Dexec.classpathScope=compile -Dexec.mainClass=storm.starter.ExclamationTopology
I can run the Exclamation topology class, but when I use this command:
java -cp ./target/storm-starter-0.0.1-SNAPSHOT-jar-with-dependencies.jar storm.starter.ExclamationTopology
I can't run it.
By the way, I got the second command from the Maven tutorial on Apache's site.
Could someone point out what I am doing wrong here?
PS: This is the error: http://pastebin.com/A1PQbB3r

You are hitting the java.lang.NoClassDefFoundError because the Storm jars are not on your classpath. For your second command, put the Storm jar and the jars under storm/lib on your classpath and it should work as expected.
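For example, something like this should work (a sketch only; the Storm installation path and jar name are assumptions about your setup, and on Windows the path separator is ; instead of :):
java -cp ./target/storm-starter-0.0.1-SNAPSHOT-jar-with-dependencies.jar:/path/to/storm/storm-<version>.jar:/path/to/storm/lib/* storm.starter.ExclamationTopology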

Your POM probably declares the Storm dependency with scope "provided", which means it is available on the compile classpath (so mvn exec:java works) but is not packaged into the jar-with-dependencies. Try changing the scope to "compile".
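For example (a sketch; the coordinates and version shown are illustrative and should match whatever your POM already declares):
<dependency>
    <groupId>storm</groupId>
    <artifactId>storm</artifactId>
    <version>0.8.2</version>
    <scope>compile</scope>
</dependency>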

The scope for the Storm dependency should differ depending on whether you are running in local mode or on a cluster.
For local mode you need to set the scope to "compile", or leave the tag out entirely, since the scope defaults to "compile".
In order to submit your topology to a cluster you need to set the scope to "provided"; otherwise the Storm jar will be packaged inside your topology jar, and when deploying to the cluster there will be two Storm jars on the classpath: the one inside your topology and the one inside the Storm installation directory. A property-driven scope, as sketched below, lets you switch between the two without editing the dependency.
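This is a sketch, assuming the same storm coordinates as above; the profile id and property name are made up for illustration:
<properties>
    <storm.scope>compile</storm.scope>
</properties>
<dependencies>
    <dependency>
        <groupId>storm</groupId>
        <artifactId>storm</artifactId>
        <version>0.8.2</version>
        <scope>${storm.scope}</scope>
    </dependency>
</dependencies>
<profiles>
    <profile>
        <id>cluster</id>
        <properties>
            <storm.scope>provided</storm.scope>
        </properties>
    </profile>
</profiles>
Building with mvn package -Pcluster then produces a topology jar without Storm bundled inside.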

Related

Should I use spark-submit if using spring boot

What is the purpose of spark-submit? From what I can see, it is just adding properties and jars to the classpath.
If I am using Spring Boot, can I avoid using spark-submit and just package a fat jar with all the properties I want, spark.master etc.?
Can people see any downside to doing this?
Recently I hit the same case, and I also tried to stick with the Spring Boot executable jar, which unfortunately failed in the end, though I was close. The state when I gave up was: a Spring Boot jar built without the Spark/Hadoop libs included, which I ran on the cluster with -Dloader.path='spark/hadoop libs list extracted from SPARK_HOME and HADOOP_HOME on cluster'. I ended up using the second option: building a fat jar with the Maven Shade plugin and running it as a usual jar via spark-submit, which seems a bit of a strange solution but still works OK.
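For reference, a minimal sketch of that shade setup (the version is illustrative; a real Spring Boot fat jar usually also needs resource transformers to merge META-INF files such as spring.factories):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>
The Spark and Hadoop dependencies stay scoped as provided so that spark-submit supplies them from the cluster.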

mvn spring-boot:run vs java -jar

I know it may sound like a silly question, but I am unable to understand the difference between mvn spring-boot:run and java -jar (on the .jar file generated by mvn install).
I have a Spring Boot application with JSP pages in /src/main/resources/META-INF/resources/WEB-INF/. If I use mvn spring-boot:run these pages are served, but if I use java -jar these pages are not found by the application.
The application that I am working on is at https://github.com/ArslanAnjum/angularSpringApi
UPDATE:
It works with Spring Boot 1.4.2.RELEASE, while I intend to use the latest version, i.e. 1.5.8.RELEASE.
UPDATE:
Well, I solved the problem by putting the JSPs in src/main/webapp/WEB-INF/views/, changing the packaging type to war, and then running this war using java -jar target/myapp.war; it's working fine now.
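For the record, the packaging change is a one-liner in the POM (assuming the Spring Boot Maven plugin is already configured, which repackages the war so it stays executable with java -jar):
<packaging>war</packaging>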
Short answer: spring-boot:run is a java -jar command on steroids running as part of your Maven build, ensuring all required parameters are passed to your app (such as resources). spring-boot:run will also ensure that your project is compiled by executing the test-compile lifecycle goals prior to running your app.
Long answer:
When you run java -jar, you launch a new JVM instance with all the parameters you passed to this JVM. For example, using the Spring doc example
java -Xdebug -Xrunjdwp:server=y,transport=dt_socket,address=8000,suspend=n \
     -jar target/myproject-0.0.1-SNAPSHOT.jar
You will launch a brand new JVM with the given parameters. You need to make sure to include everything needed, such as classpath elements, application parameters, JVM options, etc. on the command line.
When you run mvn spring-boot:run, you launch a Maven build that will:
Run the build lifecycle up to the test-compile phase; by default this executes the resources:resources, compiler:compile, resources:testResources and compiler:testCompile goals of the Maven Resources and Compiler plugins.
Launch your application with a bunch of parameters that will depend on the
Spring Boot Maven Plugin configuration you defined in your project (your pom.xml, parents and settings, command line, etc.). This includes among other things:
A lot of classpath elements: your target/classes folder which may contain resources and libraries required by your app, your Maven dependencies, etc.
Whether to fork your JVM or not (i.e. whether to create a brand new JVM to run your app or reuse the JVM of the Maven build); see the fork and agent parameters of the plugin, as sketched below.
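A minimal sketch of that plugin configuration (the jvmArguments value is illustrative and only takes effect when forking):
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <fork>true</fork>
        <jvmArguments>-Xdebug -Xrunjdwp:server=y,transport=dt_socket,address=8000,suspend=n</jvmArguments>
    </configuration>
</plugin>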
As per:
I have a spring boot application with jsp pages in
/src/main/resources/META-INF/resources/WEB-INF/. If I use mvn
spring-boot:run these pages are served. But If I use java -jar these
pages are not found by application.
It's because the mvn spring-boot:run command makes sure your target/classes folder is present on the classpath when your app is running. After compilation, this folder will contain target/classes/META-INF/resources/WEB-INF among other things. Your app will then be able to find META-INF/resources/WEB-INF and load its pages when asked. When you ran the java -jar command, this folder was probably not on the classpath, so your app was not able to find your resources. (These resources were copied from the src/main/resources folder during the resources:resources goal.)
To get a similar result with a plain java command, you must include those resources on the classpath yourself. Note that -cp is ignored when -jar is used, so run your main class explicitly, e.g. java -cp myapp.jar:/path/to/my/project/target/classes your.MainClass (with ; instead of : as the separator on Windows).
Have you tried creating the jar file with mvn package instead of mvn install before running it with java -jar? package creates a jar/war as per your POM file, whereas install additionally installs the generated jar into your local repository so that other projects can use it as a dependency.

Setting spark classpaths on EC2: spark.driver.extraClassPath and spark.executor.extraClassPath

Reducing the size of the application jar by providing a Spark classpath for the Maven dependencies:
My cluster has 3 EC2 instances on which Hadoop and Spark are running. If I build the jar with the Maven dependencies included, it becomes too large (around 100 MB), which I want to avoid since the jar is replicated to all nodes each time I run the job.
To avoid that, I build the application with a plain mvn package. For dependency resolution I downloaded all the Maven dependencies on each node and then provided only the jar paths below.
I added the classpath on each node in spark-defaults.conf as:
spark.driver.extraClassPath /home/spark/.m2/repository/com/google/code/gson/gson/2.3.1/gson-2.3.1.jar:/home/spark/.m2/repository/com/datastax/cassandra/cassandra-driver-core/2.1.5/cassandra-driver-core-2.1.5.jar:/home/spark/.m2/repository/com/google/guava/guava/16.0.1/guava-16.0.1.jar:/home/spark/.m2/repository/com/google/collections/google-collections/1.0/google-collections-1.0.jar:/home/spark/.m2/repository/com/datastax/spark/spark-cassandra-connector-java_2.10/1.2.0-rc1/spark-cassandra-connector-java_2.10-1.2.0-rc1.jar:/home/spark/.m2/repository/com/datastax/spark/spark-cassandra-connector_2.10/1.2.0-rc1/spark-cassandra-connector_2.10-1.2.0-rc1.jar:/home/spark/.m2/repository/org/apache/cassandra/cassandra-thrift/2.1.3/cassandra-thrift-2.1.3.jar:/home/spark/.m2/repository/org/joda/joda-convert/1.2/joda-convert-1.2.jar
This has worked locally on a single node. Still, I am getting this error. Any help will be appreciated.
Finally, I was able to solve the problem. I created the application jar using mvn package instead of mvn clean compile assembly:single, so that it does not bundle the Maven dependencies while creating the jar (but these jars/dependencies then need to be provided at run time), which results in a small jar (as it contains only references to the dependencies).
Then, I added the below two parameters in spark-defaults.conf on each node:
spark.driver.extraClassPath /home/spark/.m2/repository/com/datastax/cassandra/cassandra-driver-core/2.1.7/cassandra-driver-core-2.1.7.jar:/home/spark/.m2/repository/com/googlecode/json-simple/json-simple/1.1/json-simple-1.1.jar:/home/spark/.m2/repository/com/google/code/gson/gson/2.3.1/gson-2.3.1.jar:/home/spark/.m2/repository/com/google/guava/guava/16.0.1/guava-16.0.1.jar
spark.executor.extraClassPath /home/spark/.m2/repository/com/datastax/cassandra/cassandra-driver-core/2.1.7/cassandra-driver-core-2.1.7.jar:/home/spark/.m2/repository/com/googlecode/json-simple/json-simple/1.1/json-simple-1.1.jar:/home/spark/.m2/repository/com/google/code/gson/gson/2.3.1/gson-2.3.1.jar:/home/spark/.m2/repository/com/google/guava/guava/16.0.1/guava-16.0.1.jar
So the question arises: how will the application jar get the Maven dependencies (the required jars) at run time?
For that, I downloaded all the required dependencies to each node in advance using mvn clean compile assembly:single.
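As a side note, a more direct way to collect all dependency jars into one folder (a suggestion, not what was used above; the output directory is a placeholder) is the Maven Dependency plugin:
mvn dependency:copy-dependencies -DoutputDirectory=/home/spark/libs
which copies every dependency of the project into the given directory.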
You don't need to put all the jar files on the classpath, just your application jar file.
If you get the error again, then add the jar files which are actually needed.
You have to pass those jars via the setJars() method.
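A minimal sketch of that call (the app name, master URL and jar path are placeholders):
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf()
        .setAppName("my-app")
        .setMaster("spark://master:7077")
        // ship the application jar (and any extra jars) to the executors
        .setJars(new String[] { "target/my-app-1.0.jar" });
JavaSparkContext sc = new JavaSparkContext(conf);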

dependency issues with app while deploying in tomcat-server

I am using HBase 0.94.7, Hadoop 1.0.4 and Tomcat 7.
I wrote a small REST-based application which performs CRUD operations on HBase.
Earlier I used to run the app using the Maven Tomcat plugin.
Now I am trying to deploy the war in a Tomcat server.
Since the Hadoop and HBase jars already contain older versions of the org.mortbay.jetty, jsp-api and servlet-api jars, I am getting AbstractMethodError exceptions.
Here's the exception log.
So I then added an exclusion of org.mortbay.jetty to both the Hadoop and HBase dependencies in pom.xml (as sketched below), but it started showing more and more issues of this kind, e.g. with Jasper.
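The exclusion looks roughly like this (a sketch; with Maven older than 3.2.1 you must list each excluded artifact instead of using the wildcard):
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.0.4</version>
    <exclusions>
        <exclusion>
            <groupId>org.mortbay.jetty</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>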
So I then added scope provided to the Hadoop and HBase dependencies.
Now Tomcat is unable to find the Hadoop and HBase jars.
Can someone help me fix these dependency issues?
Thanks.
Do one thing:
- Right-click on the project
- Go to Properties
- Type "Java Build Path"
- Go to the third tab, Libraries
- Remove the lib and Maven dependencies
- Clean and rebuild your project.
That might solve your problem.

What is wrong with my neo4j test setup? EmbeddedNeo4j.java, neo4j, maven

I started a project with Maven using the "quickstart" archetype. I then changed my POM to include Neo4j:
https://github.com/ENCE688R/msrcs/blob/master/pom.xml
I added:
https://github.com/neo4j/neo4j/blob/master/community/embedded-examples/src/main/java/org/neo4j/examples/EmbeddedNeo4j.java
and ran
mvn package
This works with no errors, but
java -cp target/msrcs-1.0-SNAPSHOT.jar org.neo4j.examples.EmbeddedNeo4j
returns the error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/neo4j/graphdb/RelationshipType
What am I missing? At this point I simply need to test that I can include and use Neo4j.
Use
mvn exec:java -Dexec.mainClass=org.neo4j.examples.EmbeddedNeo4j
There is also mvn dependency:copy-dependencies, which copies all dependencies to target/dependency.
And there is the appassembler Maven plugin, which generates startup shell scripts that include all your dependencies on the classpath.
And last but not least there is the Maven Assembly plugin (mvn assembly:single), which generates a single jar file that you can run with java -jar my-jar-file.jar.
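A minimal sketch of that assembly configuration, assuming EmbeddedNeo4j as the main class:
<plugin>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
        <archive>
            <manifest>
                <mainClass>org.neo4j.examples.EmbeddedNeo4j</mainClass>
            </manifest>
        </archive>
    </configuration>
</plugin>
After mvn assembly:single the runnable jar ends up in target/ with a -jar-with-dependencies suffix.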
You need to add the Neo4j dependencies to your classpath as well. At the moment you're only adding the jar you built. If you look at this POM you'll see that the Neo4j examples require many other dependencies.
Find the libs directory where the dependencies have been downloaded (this may be in your local .m2 Maven repo) and add these jars to your classpath. You do not need to add each jar one by one, as you can simply add a directory with wildcards, e.g.:
Windows:
java -cp "target/msrcs-1.0-SNAPSHOT.jar;lib/*" org.neo4j.examples.EmbeddedNeo4j
Mac/Unix:
java -cp "target/msrcs-1.0-SNAPSHOT.jar:lib/*" org.neo4j.examples.EmbeddedNeo4j
I've started to work on some Maven archetypes which could be a good starting point as well.
For Java Neo4j projects, use neo4j-archetype-quickstart.
For Spring Data Neo4j projects, use sdn-archetype-quickstart.
