java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.reloadExistingConfigurations()V - maven

It looks like I am stuck again on running a packaged Spark app JAR using spark-submit. Following is my pom file:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<parent>
<artifactId>oneview-forecaster</artifactId>
<groupId>com.dataxu.oneview.forecast</groupId>
<version>1.0.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>forecaster</artifactId>
<dependencies>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.module</groupId>
<artifactId>jackson-module-scala_${scala.binary.version}</artifactId>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
<!--<scope>provided</scope>-->
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.2.0</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-aws</artifactId>
<version>2.8.3</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.10.60</version>
</dependency>
<!-- https://mvnrepository.com/artifact/joda-time/joda-time -->
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.9.9</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.8.0</version>
<!--<scope>provided</scope>-->
</dependency>
</dependencies>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>${scala-maven-plugin.version}</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>com.dataxu.oneview.forecaster.App</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
Following is a simple snippet of code which fetches data from an S3 location and prints it:
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

def getS3Data(path: String): Map[String, Any] = {
  println("spark session start.........")
  val spark = getSparkSession()
  // read the S3 object as text and concatenate all lines into one string
  val configTxt = spark.sparkContext.textFile(path)
    .collect().reduce(_ + _)
  // parse the JSON string into a Scala Map via jackson-module-scala
  val mapper = new ObjectMapper
  mapper.registerModule(DefaultScalaModule)
  mapper.readValue(configTxt, classOf[Map[String, String]])
}
When I run it from IntelliJ, everything works fine and the log looks clean. However, when I package it using mvn package and run it with spark-submit, it fails at the .collect().reduce(_ + _) call. Following is the error I encounter:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.reloadExistingConfigurations()V
at org.apache.hadoop.fs.s3a.S3AFileSystem.addDeprecatedKeys(S3AFileSystem.java:181)
at org.apache.hadoop.fs.s3a.S3AFileSystem.<clinit>(S3AFileSystem.java:185)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
...
I don't understand which dependency was not packaged or what the issue might be, since I did set the versions correctly, expecting hadoop-aws to pull in everything it needs.
Any help will be appreciated.

The dependencies between Hadoop and the AWS SDK are very sensitive, and you should stick to the versions that your Hadoop dependency was built with.
The first problem you need to solve is to pick one version of Hadoop; you are currently mixing 2.8.3 and 2.8.0.
When I look at the dependency tree for org.apache.hadoop:hadoop-aws:2.8.0 (you can inspect it yourself with mvn dependency:tree), I see that it is built against version 1.10.6 of the AWS SDK (the same holds for hadoop-aws:2.8.3).
This is probably what's causing the mismatch (you're mixing incompatible versions). So:
Choose the version of Hadoop you want to use
Include hadoop-aws with the version compatible with your Hadoop
Remove the other Hadoop/AWS dependencies, or include them only with versions matching your Hadoop version (see the sketch after this list).
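For illustration, an aligned set of dependencies for Hadoop 2.8.3 might look like this (a sketch, not a drop-in fix; aws-java-sdk-s3 1.10.6 is the version the hadoop-aws 2.8.x line declares):
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.8.3</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-aws</artifactId>
<version>2.8.3</version>
</dependency>
<!-- the narrow S3 artifact at the version hadoop-aws was built against,
     instead of the monolithic aws-java-sdk -->
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-s3</artifactId>
<version>1.10.6</version>
</dependency>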

In case anybody else is still stumbling over this error... it took me a while to find out, but check if your project has a dependency (direct or transitive) on the package org.apache.avro/avro-tools.
It was brought into my code by a transitive dependency.
Its problem is that it ships with a copy of org.apache.hadoop.conf.Configuration
that is much older than all current versions of Hadoop, so it may end up being the one picked up on the classpath.
In my Scala project (built with sbt), I just had to exclude it with
ExclusionRule("org.apache.avro", "avro-tools")
and the error (finally!) disappeared.
I am sure the avro-tools developers had some good reason to include a copy of a class that belongs to another artifact (hadoop-common); I was really surprised to find it there, and it made me waste an entire day.
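Since this question is about Maven, the equivalent there is an <exclusions> block on whichever dependency drags avro-tools in (a sketch; the outer artifact is a hypothetical placeholder):
<dependency>
<groupId>some.group</groupId>
<artifactId>artifact-that-pulls-in-avro-tools</artifactId>
<version>1.0</version>
<exclusions>
<!-- keep avro-tools (and its bundled old Configuration class) off the classpath -->
<exclusion>
<groupId>org.apache.avro</groupId>
<artifactId>avro-tools</artifactId>
</exclusion>
</exclusions>
</dependency>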

In my case, I was running a local Spark installation on a Cloudera edge node and was hitting this conflict (even though I made sure to download Spark with the correct hadoop binaries precompiled). I just went into my Spark home and moved the hadoop-common jar so it wouldn't be loaded:
mv ~/spark-2.4.4-bin-hadoop2.6/jars/hadoop-common-2.6.5.jar ~/spark-2.4.4-bin-hadoop2.6/jars/hadoop-common-2.6.5.jar.XXXXXX
After that, it ran... in local mode anyway.

Related

What is meant by "Unresolved requirement: Import-Package: com.google.common.collect" in Liferay 7.2

I am creating a hook in Liferay 7.2, but unfortunately when I deploy it I come across this error. I tried increasing the version of the "com.google.collections" dependency and also tried adding guava as a dependency, but nothing seems to resolve it.
My dependencies in pom.xml are as follows:
<dependencies>
<dependency>
<groupId>com.liferay.portal</groupId>
<artifactId>com.liferay.portal.kernel</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>org.osgi.service.component.annotations</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.google.collections</groupId>
<artifactId>google-collections</artifactId>
<version>1.0-rc2</version>
</dependency>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>osgi.cmpn</artifactId>
<version>6.0.0</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.1.2</version>
<configuration>
<archive>
<manifestFile>${project.build.outputDirectory}/META-INF/MANIFEST.MF</manifestFile>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>biz.aQute.bnd</groupId>
<artifactId>bnd-maven-plugin</artifactId>
<version>4.3.0</version>
<executions>
<execution>
<goals>
<goal>bnd-process</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>biz.aQute.bnd</groupId>
<artifactId>biz.aQute.bndlib</artifactId>
<version>4.3.0</version>
</dependency>
<dependency>
<groupId>com.liferay</groupId>
<artifactId>com.liferay.ant.bnd</artifactId>
<version>3.2.6</version>
</dependency>
</dependencies>
Error:
org.osgi.framework.BundleException: Could not resolve module: com.allen.portal.hook [1272] Unresolved requirement: Import-Package: com.google.common.collect
at org.eclipse.osgi.container.Module.start(Module.java:444)
at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:428)
at com.liferay.portal.file.install.internal.DirectoryWatcher._startBundle(DirectoryWatcher.java:1106)
at com.liferay.portal.file.install.internal.DirectoryWatcher._startBundles(DirectoryWatcher.java:1139)
at com.liferay.portal.file.install.internal.DirectoryWatcher._process(DirectoryWatcher.java:1001)
at com.liferay.portal.file.install.internal.DirectoryWatcher.run(DirectoryWatcher.java:313)
If you have any way to resolve this error, please help me out.
Unrelated: you're using an rc2 version released in October 2009, when the final release was made in December 2009? Seriously?
It looks like you're building an OSGi module, which compiles fine (because you provide the dependency). However, that does not mean that the google collections code ends up in your jar as well. The runtime expects to find it though - and as Google collections is not an OSGi bundle itself, you'll have several choices:
repackage it as OSGi bundle (and deploy it to the runtime) (or find someone who did it already)
repackage it within your own bundle
use a different implementation. Chances are that collections utility code from 2009 has found its way into more current implementations and is no longer necessary.
In short: In one way or another, you'll need to make your dependencies available at runtime. Either by fattening your own bundle (but be careful: You can't pass those collections around to other bundles if they bring their own implementation) or by relying on the implementation being available to the runtime.
The third alternative is to switch to an implementation where it's easier to make it available at runtime, preferably as OSGi bundle.
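If you decide to repackage it within your own bundle and build with Maven, the Felix maven-bundle-plugin has an Embed-Dependency instruction for this (a sketch under that assumption; note this question's build uses the bnd-maven-plugin, where the equivalent is an -includeresource instruction):
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<extensions>true</extensions>
<configuration>
<instructions>
<!-- copy the google-collections jar into the bundle and add it to the Bundle-ClassPath -->
<Embed-Dependency>google-collections;scope=compile|runtime</Embed-Dependency>
</instructions>
</configuration>
</plugin>
But remember the caveat above about passing those embedded classes around to other bundles.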

Manually creating a deployable JAR for Liferay

I created a Liferay workspace in Gradle format; it basically only contains a theme and a TemplateContextContributor module.
Now I want to build a Maven "wrapper" around both artifacts to make them compatible with some other Maven processes/plugins while keeping the original Gradle structure. I don't want to use the liferay-maven-plugin or the Maven tools to build those artifacts, because they seem to behave differently from the Gradle/gulp toolset when it comes to compiling scss, for example.
So I created some POMs from scratch for
Theme
TemplateContextContributor-Module
First off, I will talk about the mechanism for the theme, which is already working:
That wrapper uses the maven-war-plugin to bundle the contents of the build/ folder, where the previously built Gradle artifact resides, into a WAR file that Liferay can deploy without problems.
theme pom.xml:
<properties>
<src.dir>src</src.dir>
<com.liferay.portal.tools.theme.builder.outputDir>build</com.liferay.portal.tools.theme.builder.outputDir>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
[...]
<plugin>
<artifactId>maven-war-plugin</artifactId>
<version>3.0.0</version>
<configuration>
<webResources>
<resource>
<directory>${com.liferay.portal.tools.theme.builder.outputDir}</directory>
<excludes>
<exclude>**/*.sass-cache/</exclude>
</excludes>
</resource>
</webResources>
</configuration>
</plugin>
However, I am having difficulties creating an OSGi-compatible JAR file for the module contents. It seems that the META-INF/MANIFEST.MF does not contain the right information, and I cannot seem to generate it in a way that Liferay (or OSGi) understands.
These are the module pom.xml dependencies and plugins that I tried:
<dependency>
<groupId>org.apache.felix</groupId>
<artifactId>org.apache.felix.scr.ds-annotations</artifactId>
<version>1.2.10</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.liferay</groupId>
<artifactId>com.liferay.gradle.plugins</artifactId>
<version>3.9.9</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.liferay.portal</groupId>
<artifactId>com.liferay.portal.kernel</artifactId>
<version>2.0.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>org.osgi.service.component.annotations</artifactId>
<version>1.3.0</version>
<scope>provided</scope>
</dependency>
[...]
<plugin>
<groupId>biz.aQute.bnd</groupId>
<artifactId>bnd-maven-plugin</artifactId>
<version>3.3.0</version>
<executions>
<execution>
<goals>
<goal>bnd-process</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>biz.aQute.bnd</groupId>
<artifactId>biz.aQute.bndlib</artifactId>
<version>3.2.0</version>
</dependency>
<dependency>
<groupId>com.liferay</groupId>
<artifactId>com.liferay.ant.bnd</artifactId>
<version>2.0.48</version>
</dependency>
</dependencies>
</plugin>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-scr-plugin</artifactId>
<version>1.25.0</version>
<executions>
<execution>
<id>generate-scr-scrdescriptor</id>
<goals>
<goal>scr</goal>
</goals>
</execution>
</executions>
</plugin>
I was able to create a JAR using the above, but its META-INF/MANIFEST.MF is not identical to the one produced by the Gradle build; I guess that's why Liferay does not deploy it. The log says "processing module xxx ....", but that never finishes and the module does not work in Liferay.
These are the plugins I have tried in different combinations so far:
maven-build-plugin
maven-scr-plugin
maven-jar-plugin
maven-war-plugin
maven-compiler-plugin
Any help in creating a liferay-deployable module JAR would be great.
I'm not sure why you're manually building a maven wrapper for the Template Context Contributor. The Liferay (blade) samples are available for Liferay-workspace, pure Gradle as well as for Maven. I'd just go with the standard and not worry about re-inventing the wheel.
To make this answer self-contained: The current pom.xml listed in the Template Context Contributor plugin is:
<project
xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
>
<modelVersion>4.0.0</modelVersion>
<artifactId>template-context-contributor</artifactId>
<version>1.0.0</version>
<packaging>jar</packaging>
<parent>
<groupId>blade</groupId>
<artifactId>parent.bnd.bundle.plugin</artifactId>
<version>1.0.0</version>
<relativePath>../../parent.bnd.bundle.plugin</relativePath>
</parent>
<dependencies>
<dependency>
<groupId>com.liferay.portal</groupId>
<artifactId>com.liferay.portal.kernel</artifactId>
<version>2.0.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax.portlet</groupId>
<artifactId>portlet-api</artifactId>
<version>2.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<version>3.0.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>org.osgi.service.component.annotations</artifactId>
<version>1.3.0</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<finalName>com.liferay.blade.template.context.contributor-${project.version}</finalName>
</build>
</project>

Apache Spark dependency issue

I'm trying to run my Spark application on a Hadoop cluster.
The Spark version running on the cluster is 1.3.1. I'm getting the error posted below when packaging and running my application on the cluster. I looked at other posts as well; it seems like I'm messing up the library dependencies, but I couldn't figure out what!
Here is some other information that might be helpful:
hadoop version:
Hadoop 2.7.1.2.3.0.0-2557
Subversion git@github.com:hortonworks/hadoop.git -r 9f17d40a0f2046d217b2bff90ad6e2fc7e41f5e1
Compiled by jenkins on 2015-07-14T13:08Z
Compiled with protoc 2.5.0
From source with checksum 54f9bbb4492f92975e84e390599b881d
This command was run using /usr/hdp/2.3.0.0-2557/hadoop/lib/hadoop-common-2.7.1.2.3.0.0-2557.jar
The error stack:
java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveContext: method <init>(Lorg/apache/spark/api/java/JavaSparkContext;)V not found
at com.cyber.app.cyberspark_app.main.Main.main(Main.java:163)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:577)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:174)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:197)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
My pom.xml looks like this:
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>path.to.my.main.Main</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id> <!-- this is used for inheritance merges -->
<phase>package</phase> <!-- bind to the packaging phase -->
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>1.6.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>1.6.1</version>
<scope>provided</scope>
</dependency>
</dependencies>
I'm using "mvn package" to package my jar.
EDIT:
I tried changing all the versions to 1.3.1, but then I would need to change my application, as I'm using features that only became available after 1.3.1.
But if I put everything on 1.6.1 compiled under Scala 2.10, I get the same error.
Please let me know if I need to provide any additional information. Any help will be greatly appreciated.
Thank you.
This is likely a binary compatibility issue.
First, make sure that all your Spark dependencies are on Spark 1.3.1. I see that your Spark SQL dependency is on 1.6.1.
Second, you are using Spark compiled against Scala 2.11. The typical distribution of Spark is compiled only against 2.10; if you want the 2.11 version, you usually need to compile Spark yourself.
If you are not sure which Scala version the Spark on your cluster was compiled with, I would change all my dependencies to use "2.10" instead of "2.11" and try again, as in the sketch below.
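For instance, the Spark dependency block might end up like this (a sketch, assuming the cluster really runs a Scala 2.10 build of Spark 1.3.1):
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.10</artifactId>
<version>1.3.1</version>
<scope>provided</scope>
</dependency>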

OSGi bundle compile error

When I build my bundle, maven throws the exception:
[ERROR] Bundle com.onboard:com.onboard.service.security:bundle:3.0.0-SNAPSHOT : Exporting packages that are not on the Bundle-Classpath[Jar:dot]: [about_files, XXX]
[ERROR] Error(s) found in bundle configuration
I use the maven-bundle-plugin to build my code:
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>2.3.7</version>
<extensions>true</extensions>
<configuration>
<manifestLocation>src/main/resources/META-INF</manifestLocation>
<instructions>
<Bundle-SymbolicName>${bundle.symbolicName}</Bundle-SymbolicName>
<Bundle-Version>${project.version}</Bundle-Version>
<Export-Package>${bundle.Export-Package};version="${project.version}"</Export-Package>
<Private-Package>!${bundle.Export-Package};${bundle.Export-Package}.internal.*</Private-Package>
<_include>osgi.bnd</_include>
</instructions>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
And my dependencies are:
<dependency>
<groupId>org.eclipse.jetty.orbit</groupId>
<artifactId>javax.servlet</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>org.springframework.security.web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>org.springframework.web.servlet</artifactId>
</dependency>
<dependency>
<groupId>org.elevenframework</groupId>
<artifactId>org.elevenframework.web.api</artifactId>
</dependency>
<dependency>
<groupId>com.onboard</groupId>
<artifactId>com.onboard.domain.model</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>com.onboard</groupId>
<artifactId>com.onboard.service.common</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>com.onboard</groupId>
<artifactId>com.onboard.service.web</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>com.onboard</groupId>
<artifactId>com.onboard.service.account</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>com.onboard</groupId>
<artifactId>com.onboard.service.collaboration</artifactId>
<version>${project.version}</version>
</dependency>
The application was running fine a few days ago. I think a change to one of the com.onboard.XXX dependencies led to this, but I don't know which one. What shall I do?
This appears to be the same question that you asked here with much more information
Your maven-bundle-plugin configuration is exporting the packages from the maven property bundle.Export-Package
<Export-Package>${bundle.Export-Package};version="${project.version}"</Export-Package>
A bundle should never export a package that it does not contain. In this case your bundle is trying to export about_files and XXX. Both of these things look very wrong.
You haven't included the full POM, nor have you included the osgi.bnd file referenced in your maven-bundle-plugin configuration, but it looks as though this misconfiguration is the source of the error.
In my view, trying to automate package exports and private packages with properties is usually a mistake. In a Maven module you almost invariably want to private-package all of the classes from the src/main/java folder, include the files from src/main/resources, and export specific named packages; incidentally, this is how the bnd-maven-plugin works.
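For illustration, explicitly named instructions might look like this (a sketch; the package names are hypothetical and must match your module):
<configuration>
<manifestLocation>src/main/resources/META-INF</manifestLocation>
<instructions>
<Bundle-SymbolicName>${bundle.symbolicName}</Bundle-SymbolicName>
<Bundle-Version>${project.version}</Bundle-Version>
<!-- export only the named API package; keep the implementation private -->
<Export-Package>com.onboard.service.security;version="${project.version}"</Export-Package>
<Private-Package>com.onboard.service.security.internal.*</Private-Package>
</instructions>
</configuration>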

How to deploy in a gwt-maven project

First of all, I'm not sure which folders and files I have to deploy in a GWT-Maven project.
I've got:
.gwt
.settings
bin
src/main/java
target
war
pom.xml
I'm pretty sure I have to deploy the pom.xml somehow, and the target folder. But my target folder doesn't contain a pom.xml, which I need for deploying on a Jetty server.
Second:
I've installed Maven on my webserver, but apart from embedding the jetty-maven-plugin in the pom.xml (via
<plugin>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-maven-plugin</artifactId>
</plugin>
)
I have absolutely no clue how to get this project running on a jetty server.
My pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>SiedlerVonCatanC</groupId>
<artifactId>SiedlerVonCatanC</artifactId>
<packaging>war</packaging>
<build>
<sourceDirectory>src/main/java</sourceDirectory>
<resources>
<resource>
<directory>src/main/java</directory>
<includes>
<include>**/*.java</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>gwt-maven-plugin</artifactId>
<version>2.5.1</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test</goal>
</goals>
<configuration>
<module>main.java.de.swp.catan.SiedlerVonCatanC</module>
<runTarget>SiedlerVonCatanC.html</runTarget>
<hostedWebapp>${webappDirectory}</hostedWebapp>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
<dependencies>
<!-- GWT -->
<dependency>
<groupId>com.google.gwt</groupId>
<artifactId>gwt-servlet</artifactId>
<version>2.5.1</version>
</dependency>
<dependency>
<groupId>com.google.gwt</groupId>
<artifactId>gwt-user</artifactId>
<version>2.5.1</version>
</dependency>
<!-- SmartGWT -->
<dependency>
<groupId>com.smartgwt</groupId>
<artifactId>smartgwt</artifactId>
<version>3.0</version>
</dependency>
<!-- Event Service -->
<dependency>
<groupId>de.novanic.gwteventservice</groupId>
<artifactId>gwteventservice</artifactId>
<version>1.2.0</version>
</dependency>
<!-- Java-Mail -->
<dependency>
<groupId>javax.mail</groupId>
<artifactId>mail</artifactId>
<version>1.4</version>
</dependency>
<!-- Apache Commons -->
<dependency>
<groupId>commons-lang</groupId>
<artifactId>commons-lang</artifactId>
<version>2.3</version>
</dependency>
<!-- htmlunit (used somewhere in the project) -->
<!-- <dependency> <groupId>net.sourceforge.htmlunit</groupId> <artifactId>htmlunit</artifactId>
<version>2.4</version> </dependency> -->
<!-- Guice -->
<dependency>
<groupId>com.google.inject</groupId>
<artifactId>guice</artifactId>
<version>3.0</version>
</dependency>
<!-- Connector for JDBC -->
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.25</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>com.smartgwt</id>
<url>http://www.smartclient.com/maven2</url>
</repository>
</repositories>
<version>0.2</version>
Have the maven-jetty-plugin included in your pom, under plugins, and then you can run it as
mvn jetty:run
thanks
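Note that the plugin entry in your POM has no version; a complete entry would look something like this (the version shown is only an example, pick a current one compatible with your JDK):
<plugin>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-maven-plugin</artifactId>
<version>9.2.10.v20150310</version>
</plugin>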
If you want to use the Maven plugin to run embedded Jetty for development, then an easy way using the command line (which I recommend to start with) is:
Edit: Please note this is not related to the existing project; it's the process from scratch. Just run these commands from a new directory where you would like a new GWT project to be created.
Download the gwt-maven-plugin from http://mojo.codehaus.org/gwt-maven-plugin/. In the command-line list you can see it as org.codehaus.mojo:gwt-maven-plugin (Maven plugin for the Google Web Toolkit). I see that you have this plugin in your POM, but if you are at a loss, the simple steps from the command line are:
mvn archetype:generate
enter org.codehaus.mojo:gwt-maven-plugin at the search prompt
confirm the result with Enter
set up the Maven project configuration as prompted
run the project with mvn gwt:run from the folder where you have the pom.xml
These steps will run a sample project for you, which you can further modify and experiment with while learning how it works.
As for directories:
Maven compiles everything into the target directory, and that is what is used for deployment or for running embedded Jetty. For a war-packaged project, mvn package produces target/<artifact>.war, which is what you would deploy to a standalone Jetty's webapps directory.
