Acceleo maven generation multi artifact/project - maven

I am trying to construct a maven acceleo generator.
The generator consists of multiple acceleo projects (artifacts in maven), with inter-dependencies.
I am running into a problem with the dependencies between emtl files.
At runtime, I get errors stating that there are compilation errors in the MTL (there are not).
I am guessing it may be the hrefs within the compiled emtl files.
There is an acceleo maven compile plugin that allows for these to be either:
a) absolute paths (ending up to be via the maven repository)
b) 'platform:/plugin/...' paths.
(a) works if the Acceleo projects (Maven artifacts) are built on the same machine as the one on which we do the generation, but if the location of the Maven repository changes, we have a problem. Hence deploying the Maven artifacts ends up being pointless.
(b) doesn't work because running from within maven, 'platform:/plugin/' cannot be resolved.
I have tried overriding the 'createURIConverter' method in the AbstractAcceleoGenerator
so that, using a URLClassLoader, it can decode the 'platform:/plugin/...' hrefs and find the correct emtl file.
I can verify that this seems to be working; however, I still get the reported
"org.eclipse.acceleo.engine.AcceleoEvaluationException: Unresolved compilation error in generation module"
Any suggestions?

The exception was caused by hrefs in the EMTL model resolving to null.
The problem with resolving the hrefs was just a lack of resource factories being registered (the exception informing me of this gets lost in EcoreUtil.resolve, which catches it with "// Failure to resolve is ignored." - most unhelpful).
So it seems that my approach of overriding the 'createURIConverter' method and getting it to decode the 'platform:/plugin/..' hrefs using a URLClassLoader does work.
The URLClassLoader is built the same way as in the AcceleoParserMojo.
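For reference, here is a minimal sketch of that approach. The class name, the way the URLClassLoader is populated and the path decoding are placeholders/assumptions of mine (the real class loader is built from the dependency jars the way AcceleoParserMojo does it), and it assumes Acceleo 3.x / EMF APIs with 'Generate' standing for the generated AbstractAcceleoGenerator subclass:

import java.net.URL;
import java.net.URLClassLoader;

import org.eclipse.emf.common.util.URI;
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.emf.ecore.resource.URIConverter;
import org.eclipse.emf.ecore.resource.impl.ExtensibleURIConverterImpl;
import org.eclipse.emf.ecore.xmi.impl.EcoreResourceFactoryImpl;
import org.eclipse.emf.ecore.xmi.impl.XMIResourceFactoryImpl;

// 'Generate' = the generated subclass of AbstractAcceleoGenerator.
public class StandaloneGenerator extends Generate {

    static {
        // Without registered resource factories the hrefs silently resolve to null
        // (EcoreUtil.resolve swallows the exception).
        Resource.Factory.Registry.INSTANCE.getExtensionToFactoryMap()
                .put("ecore", new EcoreResourceFactoryImpl());
        Resource.Factory.Registry.INSTANCE.getExtensionToFactoryMap()
                .put("emtl", new XMIResourceFactoryImpl());
    }

    // Hypothetical: built from the jars containing the dependency emtl files,
    // the same way AcceleoParserMojo builds its class loader.
    private final URLClassLoader emtlClassLoader =
            new URLClassLoader(new URL[] { /* URLs of the dependency jars */ },
                    getClass().getClassLoader());

    @Override
    public URIConverter createURIConverter() {
        return new ExtensibleURIConverterImpl() {
            @Override
            public URI normalize(URI uri) {
                // Decode 'platform:/plugin/<bundle>/<path>' by looking <path> up
                // via the URLClassLoader.
                if ("platform".equals(uri.scheme()) && uri.segmentCount() > 2
                        && "plugin".equals(uri.segment(0))) {
                    String path = uri.path().replaceFirst("^/plugin/[^/]+/", "");
                    URL resolved = emtlClassLoader.getResource(path);
                    if (resolved != null) {
                        return URI.createURI(resolved.toString());
                    }
                }
                return super.normalize(uri);
            }
        };
    }
}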

Related

Jar file not getting added in to a external Libraries in Intellij

I am trying to work on a Spring Security project in which I have added the Spring Security dependency via the pom.xml file. But although the Maven build completed successfully, the dependency is not getting added to the external libraries.
Please find the screenshots below:
pom.xml file:
External Library:
Dependencies added after successful Maven build:
I tried creating a new project to check whether the above issue persists there as well, but there it works as expected and I am able to get the required Java files. So the issue is only relevant to the above project.
I even tried the steps mentioned in the links below, apart from the Maven lifecycle steps, but that also did not work out:
link

How to fix errors caused by two identical jar with the same name but different versions?

In the project, there are two modules: data and infrastructure.
Data module uses the grpc plug-in provided by Google: grpc-protobuf, which refers to com.google.guava:guava [version:26.0-android].
Infrastructure module uses consul's plug-in: consul-client, which refers to com.google.guava:guava [version:22.0].
And the data module depends on the Infrastructure module.
There is no problem at compile time, but at run time ConsulCache in consul-client calls the Stopwatch.elapsed() method from com.google.guava:guava, which takes no arguments in com.google.guava:guava:22.0 but takes a parameter in com.google.guava:guava:26.0-android. ConsulCache always ends up on com.google.guava:guava:26.0-android rather than com.google.guava:guava:22.0, which results in the following exception:
java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.elapsed()Ljava/time/Duration;
at com.orbitz.consul.cache.ConsulCache$1.onComplete(ConsulCache.java:103)
Because it is a Maven project and the jar packages are managed automatically, these versions are pulled in transitively by the third-party libraries themselves.
How to fix this exception without modifying their source code?
First of all, you need to decide if you want version 22.0 or version 26.0-android.
Then you add a <dependencyManagement> entry to the poms (or the common parent pom, even better) that manages com.google.guava:guava to the version you want.
The dependencyManagement section (https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#Dependency_Management) overrides all transitively found version numbers of that dependency, so you only get the version you choose.
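As a sketch, such an entry could look like this (pinning 26.0-android here purely as an example; use whichever version you decided on):

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>26.0-android</version>
        </dependency>
    </dependencies>
</dependencyManagement>

You can check the effect with mvn dependency:tree, which shows which Guava version each module now resolves.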

Correct way to generate feature files for Karaf?

I am trying to create a Karaf assembly using Maven (and NetBeans). I create my bundles using Declarative Services, but I am having problems creating feature files. Part of my problem is the error messages that OSGi generates, but I have some more general questions.
I have discovered that I can call karaf-maven-plugin in the project that creates my bundles and it generates what appears to be a comprehensively populated feature file based on the dependencies of the bundle. (Method 1)
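For reference, method 1 is typically wired up with something like the following in the bundle project's pom; the goal name and binding here are my assumption about karaf-maven-plugin 4.1.0, not something taken from the question:

<plugin>
    <groupId>org.apache.karaf.tooling</groupId>
    <artifactId>karaf-maven-plugin</artifactId>
    <version>4.1.0</version>
    <executions>
        <execution>
            <id>generate-features</id>
            <goals>
                <goal>features-generate-descriptor</goal>
            </goals>
        </execution>
    </executions>
</plugin>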
However, I have read somewhere that creating a feature file with karaf-maven-plugin should normally only be done in a project with feature packaging. If I do this, it seems to me that I have to create the feature file by hand, which is not a lot of fun. (Method 2)
No matter which method I use, I have been unable to successfully generate a Karaf assembly that contains anything other than simple bundles without any dependencies. I am currently stuck trying to install a single bundle that needs to wrap some non-OSGi dependencies. Method 1 above generates the wrap stuff (<feature> and wrap: protocol). All I get is the following error:
Failed to execute goal org.apache.karaf.tooling:karaf-maven-plugin:4.1.0:assembly (default-assembly) on project EnoceanBridgeAdmin: Unable to build assembly: [wrap/0.0.0]
EnoceanBridgeAdmin is the karaf-assembly packaging that I'm trying to build. It has a dependency on the bundle that contains the generated feature file (where wrap is referenced):
<dependency>
    <groupId>net.winnall.enocean.service.impl</groupId>
    <artifactId>EnoceanBridgeSASS.Impl</artifactId>
    <version>0.99.99</version>
    <type>xml</type>
    <classifier>features</classifier>
</dependency>
So my questions:
Is method 1 above a correct usage?
Can I automatically generate a feature file to use method 2?
Will the error message disappear after I've got method 1 or 2 sorted?
Steve
I have resolved this myself.
Method 1 does not work because karaf-maven-plugin generates a <feature> definition for wrap, which causes the error mentioned above. Apparently, at least with Karaf 4.1.0, the wrap: protocol has to be used in a feature file without a prior <feature> definition for it.
Method 2 (writing the feature file yourself) is thus the only viable option because of the behaviour of karaf-maven-plugin.
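For anyone else going down that road, a hand-written feature file along the lines of method 2 looks roughly like this. The feature name, versions and the wrapped artifact are placeholders; the point is that wrap: is used directly on the bundle URL, with no <feature> definition for wrap itself:

<features xmlns="http://karaf.apache.org/xmlns/features/v1.4.0" name="enocean-bridge">
    <feature name="enocean-bridge-sass-impl" version="0.99.99">
        <bundle>mvn:net.winnall.enocean.service.impl/EnoceanBridgeSASS.Impl/0.99.99</bundle>
        <!-- non-OSGi dependency wrapped on the fly -->
        <bundle>wrap:mvn:some.group/some-plain-jar/1.0.0</bundle>
    </feature>
</features>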
Yes, the error message disappeared :-)

Understanding Maven scoping better

I have been struggling to figure out the use of the scoping that Maven provides,
as mentioned here.
Why should you not always have compile time scoping? Real life examples would be really appreciated.
The compile-scoped dependencies (the default) are needed to compile your code and remain available on the test and runtime classpaths as well.
The test-scoped ones are only used during tests. Say you have tests using JUnit or EasyMock. You obviously do not want your final artifact to have a dependency on them, but you would like to be able to depend on these libraries while running your tests.
Those dependencies which are marked provided are expected to be on your classpath when you're running the produced artifact. For example: you have a webapp and you have a dependency on the servlet library. Obviously, you should not package it inside your WAR file, as the webapp container will already have it and a conflict may occur.
One of the reasons to have different scopes for dependencies is that different parts of the build can depend on different dependencies. For example, if you are only compiling your code and not executing any tests, there is no point in having Maven download your test dependencies (if they are not already present in your local repository, of course). The other reason is that not all dependencies need to be placed in your final artifact (whether it is an assembly or a WAR file), as some of them are only used during the build and testing phases.
compile
These jars are copied into the prepared WAR file.
Ex: hibernate-core.jar needs to be in the prepared WAR.
provided
These jars are considered only at compile time and test time.
Ex: servlet.jar will be provided by the deployment server, so there is no need to ship it in our prepared WAR file.
test
These jars are only required for running test classes.
Ex: junit.jar is required only for running JUnit test classes; there is no need to deploy it.
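A sketch of how those scopes look in a pom.xml (the coordinates and versions are only illustrative):

<dependencies>
    <!-- compile (the default): packaged into the WAR -->
    <dependency>
        <groupId>org.hibernate</groupId>
        <artifactId>hibernate-core</artifactId>
        <version>5.4.0.Final</version>
    </dependency>
    <!-- provided: on the compile and test classpaths, supplied by the container at runtime -->
    <dependency>
        <groupId>javax.servlet</groupId>
        <artifactId>javax.servlet-api</artifactId>
        <version>3.1.0</version>
        <scope>provided</scope>
    </dependency>
    <!-- test: only on the test classpaths, never deployed -->
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
        <scope>test</scope>
    </dependency>
</dependencies>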
Scopes are quite well explained in here:
https://maven.apache.org/pom.html#Dependencies
As a reference, I copied the paragraph:
scope: This element refers to the classpath of the task at hand (compiling and runtime, testing, etc.) as well as how to limit the transitivity of a dependency. There are five scopes available:
compile - this is the default scope, used if none is specified. Compile dependencies are available in all classpaths. Furthermore, those dependencies are propagated to dependent projects.
provided - this is much like compile, but indicates you expect the JDK or a container to provide it at runtime. It is only available on the compilation and test classpath, and is not transitive.
runtime - this scope indicates that the dependency is not required for compilation, but is for execution. It is in the runtime and test classpaths, but not the compile classpath.
test - this scope indicates that the dependency is not required for normal use of the application, and is only available for the test compilation and execution phases.
system - this scope is similar to provided except that you have to provide the JAR which contains it explicitly. The artifact is always available and is not looked up in a repository.
There are a couple of reasons why you might not want all dependencies to have the default compile scope:
reduce the size of the final artifact (jar, war, ...) by using a different scope;
when you have a multi-module project, each module can have its own version of a dependency;
avoid class version collisions with the provided scope; for instance, if you are going to deploy a WAR file to a WebLogic server, you need to get rid of some javax jars (javax.servlet, javax.xml.parsers, JPA jars, etc.), otherwise you might end up with a class collision error.

Struts2 Dispatcher initialization failure with maven - jetty

I have the following problem:
I am working on a web project using Struts2 with Tiles, supported by a combination of Hibernate and Spring. In addition I am using Maven (which I am new to) and the Jetty server container. My coworker told me what to do: just check out the project from the SVN repository and run the command (sudo) mvn jetty:run. According to my coworker this should work just fine, as it does when he checks out the project. But I always get the same error:
2011-08-22 10:09:20,568 ERROR org.apache.struts2.dispatcher.Dispatcher.error:38 -
Dispatcher initialization failed
Unable to load configuration. - [unknown location]
I already tried re-checking out the project and cleaning and updating Maven, but I still get the same error.
I think it has something to do with a missing Struts2 jar file, but I thought Maven downloads all necessary libraries automatically. Please give me a hint as to what could be missing; I'm sure it's something simple I overlooked.
Thanks in advance.
