I am a beginner with Gradle and I need help please.
If I want to use my own libs and external (vendor) libs, where should I put them (both the external and the personal ones)?
Thank you for helping!
(example: a screenshot of my current directory structure)
Gradle gives you a few different options for this.
In all cases, there is no fixed folder structure for local file dependencies, and you are free to choose whatever you like. libs seems fine to me.
All examples below are in Groovy DSL. Refer to the Gradle user guide for the Kotlin DSL variants.
File dependencies
To add dependencies from a local folder, you can use the Project.files() or Project.fileTree() methods. Both take paths that are resolved relative to the project directory.
dependencies {
    implementation files('libs/a.jar')                    // Single file
    implementation files('libs/b.jar', 'libs/c.jar')      // Multiple files
    implementation fileTree('libs')                        // All files in a folder
    implementation fileTree('libs') { include '*.jar' }    // All jar files in a folder
}
See File dependencies in the user guide for more information.
Flat directory repository
You can also declare a flat directory repository. Here the path is relative to where you invoke Gradle from, so to make it consistent, you should make it absolute (using projectDir or rootDir).
repositories {
    flatDir name: 'local-libs', dirs: "$projectDir/libs" // The name is optional
}
You can then declare dependencies using the normal format with a group, name and version. However, the group is ignored, so you can either write whatever you like or just leave it out:
dependencies {
    implementation 'mysql:mysql-connector-java:5.1.49' // Real Maven coordinates
    implementation ':mysql-connector-java:5.1.49'      // Shorthand for local coordinates
}
See this section for more information.
Maven repositories and separate Gradle projects
The above two methods do not include any module metadata, which means you have to find and declare all transitive dependencies manually, so this can quickly become hairy to maintain.
If possible, try to use Maven repositories instead, especially for third-party dependencies. You can even define a local repository to take advantage of metadata files (e.g. .pom or .module files) if you can't use remote ones like Maven Central or JCenter.
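As an illustration, here is a minimal sketch of such a local repository declaration in build.gradle; the local-repo folder name is an assumption:

repositories {
    mavenCentral()
    maven {
        // Hypothetical folder inside the project, laid out like a Maven repository
        // (group/artifact/version directories containing the jars and their .pom files)
        url = "$projectDir/local-repo"
    }
}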
When depending on other Gradle projects, if they are related, you could structure them as a multi-project build instead of building them and putting them into a "libs" folder.
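A minimal sketch of that multi-project option, with made-up project names:

// settings.gradle
rootProject.name = 'my-app'
include 'core', 'vendor-wrapper'

// The consuming build.gradle then depends on projects instead of jars in libs:
// dependencies {
//     implementation project(':core')
// }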
If they are not related, you could also publish them to a Maven repository (either local or remote), or you could look into composite builds, though this is a bit of an advanced topic.
A note on the MySQL Connector software license
Lastly, in case you are not aware, the community edition of MySQL Connector/J version 5.x is licensed under GPL v2. This means your own application also needs to be licensed under GPL v2.
Later versions like 8.x (I am unsure about 6 and 7) are still dual-licensed under GPL v2, but with a provision called the Universal FOSS Exception that allows you to link to and use the library in your own application without affecting your own license.
If this is just a hobby project, no one will come knocking on your door if you breach the license. But if it is for a company, be careful or you might get it into legal trouble. Oracle does audits from time to time, and it is not known to let slide an opportunity to slap you with a big license fee.
Related
I am using Gradle and its local cache is at \.gradle\caches\modules-2\files-2.1, which has all the downloaded jars but not my modules. Is there any specific place I should be searching for them?
I need this because in settings.gradle I have a dependency path specified like this:
include ':model'
project (':model').projectDir = new File(settingsDir, './model')
in a new project. Also, I don't want to specify the path that way, because if multiple projects depend on this project, mentioning the path everywhere will be difficult and awkward.
How can I make Gradle resolve it from the local Maven or Gradle repositories?
I'm still not sure what is being asked here, and I suspect there is some confusion over how multi-project builds work. So I'm going to attempt to provide a general-purpose answer.
The first question you need to answer is whether you're interested in dependencies between projects that are part of the same build — as in part of a multi-project build — or in separate builds.
Project dependencies (multi-project builds)
Project dependencies are covered in the user manual and only apply to multi-project builds. They use a logical path, using colons as 'path' separators, to specify the location of the target module, like so:
dependencies {
    implementation project(":model")
}
At this point, Gradle needs to know where ":model" exists on the file system. There's no getting around that. You have a few options:
Follow the convention of directory structure matching the logical path structure, i.e. have a MyBigProject/model directory containing the ":model" child project
Specify the file path of ":model" in settings.gradle, e.g. with project(":model").projectDir = new File(rootDir, "unusual/path/to/model")
Automate the discovery of projects
The most common approach is the first one. The second is not unusual, particularly if you want to put child projects into a separate directory, like subprojects — something the build of Gradle itself does. I haven't seen the last option done, and I don't know whether it runs into problems.
For the sake of completeness, and at your own risk if you use something like it, here's an example of automatic discovery of projects in the settings.gradle file:
rootDir.eachDir { File dir ->
    if ("build.gradle" in dir.listFiles()*.name) {
        include dir.name
    }
}
This fragment basically looks for directories within the root project folder that have a build.gradle file in them and adds them as child projects. The child projects' directory names become the projects' names.
It's not particularly clever, and you should really use different names for the build files, but it may give you some ideas to work with.
Non-project dependencies
As with project dependencies, Gradle needs to know where to get the corresponding JAR or other form of artifact for a specified module. You normally specify Maven Central or something similar for this, but there are other useful, but less common, options:
Copy a project's artifacts into the local Maven repository — both the Maven Plugin and Maven Publish Plugin support this
Publish to a Maven-compatible repository using a file:// URL rather than an HTTP/HTTPS one, which protects your projects from corruption of Maven Local (a sketch follows below)
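For example, here is a minimal maven-publish sketch in build.gradle; the java-library plugin, the coordinates and the file:// path are all assumptions:

plugins {
    id 'java-library'
    id 'maven-publish'
}

group = 'com.example'
version = '1.0.0'

publishing {
    publications {
        maven(MavenPublication) {
            from components.java // publish the jar built by this project
        }
    }
    repositories {
        maven {
            name = 'localShared'
            url = 'file:///path/to/shared/repo' // hypothetical file:// repository
        }
    }
}

Running gradlew publishToMavenLocal installs the artifacts into ~/.m2/repository, and gradlew publish pushes them to the file:// repository declared above.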
Worth noting is that Gradle supports composite builds, which allow you to substitute a normal dependency with (effectively) a project dependency from another build. So if model were part of a separate build but you had its source code and build locally, you could make changes and immediately test them in another build's project without going through the whole "install" intermediate step that is common in the Maven world (and was in Gradle before composite builds).
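As a rough sketch, assuming the model build lives in a sibling directory and is consumed via coordinates like com.example:model (both assumptions), the consuming build's settings.gradle could include it like this:

// settings.gradle of the consuming build
includeBuild '../model'

// The consuming project keeps its ordinary dependency declaration:
// dependencies {
//     implementation 'com.example:model:1.0'
// }
// Gradle automatically substitutes the included build's project for that dependency
// whenever the group and module name match.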
Hope all this makes sense.
I'm a professional services consultant for a software product company. Our product provides a Java API so field customizations can be built. These extensions are dependent upon up to 50 individual JAR files - some proprietary to our product, others are open source.
I don't want to hard-code the dependencies in the POM for several reasons - mostly because each new release will depend on slightly different versions of the open-source artifacts. When a new product version is released, I simply want to point to the new installation folder and rebuild the extensions.
So, I'm trying to create a Maven plugin that takes a reference to the installation folder, recurses through the folders, and automatically adds all *.jar files to the compile-time classpath.
I tried this:
Map<?, ?> context = getPluginContext();
MavenProject maven = (MavenProject) context.get("project");
List<String> classpath = maven.getCompileClasspathElements();
// start adding additional elements to the classpath
That snippet executes during the initialize phase, but the classpath reverts by the time compiling starts and the compile fails. Am I even going about this the right way?
This doesn't sound like the way to go.
Instead, try to analyze how the installation folder is assembled. Try to create this installation folder with Maven, too, and provide a POM that lists all of the dependencies it uses (a BOM, Bill of Materials).
Then use this BOM POM as a dependency (scope=provided) in your extensions.
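A minimal sketch of that idea, with all coordinates and versions made up:

<!-- product-bom/pom.xml: lists every JAR shipped in the product installation folder -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.product</groupId>
  <artifactId>product-bom</artifactId>
  <version>10.2.0</version>
  <packaging>pom</packaging>
  <dependencies>
    <dependency>
      <groupId>com.example.product</groupId>
      <artifactId>product-api</artifactId>
      <version>10.2.0</version>
    </dependency>
    <!-- ... one entry per JAR in the installation folder ... -->
  </dependencies>
</project>

<!-- In each extension's pom.xml, pull the whole set in with scope provided -->
<dependency>
  <groupId>com.example.product</groupId>
  <artifactId>product-bom</artifactId>
  <version>10.2.0</version>
  <type>pom</type>
  <scope>provided</scope>
</dependency>

When a new product version ships, only the BOM version referenced by the extensions needs to change.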
I've inherited a few Maven projects which have added a /dependencies directory to capture Java JAR libraries that aren't part of the project WAR and must be installed by DevOps into a Tomcat installation.
The libraries in this directory seem to fall into four categories:
"provided" scope libraries,
downstream dependencies of those provided libraries,
discoverable implementations of API jars, and
"mystery" libraries, i.e., ones not available in an external repository, where it is unclear where they ever came from.
Is there a strategy to get Maven to help manage these dependencies and perhaps fetch them for external install?
There are probably several strategies to choose from.
Number one: leave it as it is. If it works and the build is reproducible (in different environments), that seems like one valid solution.
The "mystery" part of the build might not be much of an issue for new people working with it.
I think it is also valid to create a separate Maven module to be delivered to the infrastructure team. This module can contain the jars currently in the /dependencies folder.
What you would need to do is create a pom.xml and add all the dependencies currently in that directory (of course not the transitive ones). The "mystery" ones would need to go into a repository proxy (Nexus, Artifactory, ...). If you don't have a Maven repository manager yet: you want one! (It's easy to set up and it helps a lot!)
I would then use the assembly plugin or some Ant task to build the ZIP to be delivered, so the infrastructure team is able to just unzip / copy the files where they need to be. This step can then even be scripted (so the upload / unzip is done through SSH or something like that).
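A rough sketch of such a delivery module, with coordinates, versions and the descriptor path made up:

<!-- deliverables/pom.xml -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>tomcat-deliverables</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <dependencies>
    <!-- one entry per JAR currently sitting in /dependencies -->
    <dependency>
      <groupId>org.example</groupId>
      <artifactId>some-provided-lib</artifactId>
      <version>1.2.3</version>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>3.6.0</version> <!-- illustrative version -->
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
            <configuration>
              <descriptors>
                <descriptor>src/assembly/deps.xml</descriptor>
              </descriptors>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>

<!-- src/assembly/deps.xml: packs all declared dependencies into a zip -->
<assembly>
  <id>deps</id>
  <formats>
    <format>zip</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <outputDirectory>/</outputDirectory>
      <useProjectArtifact>false</useProjectArtifact>
    </dependencySet>
  </dependencySets>
</assembly>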
This is probably only one way to do it. I would assume that resolving the jars currently in the /dependencies directory may be a bit of a pain.
The advantage is obviously that you document and simplify the management of those libraries. I would also assume that if you update some of them, it is easier to merge across branches since there are no binary files around. So it may be worth the effort.
Context:
I have a multi-module Maven project that looks like this:
Root
    ModuleA
    ModuleB
    ModuleC
    ModuleD
    .......
There are around 25 modules under the Root:
A few of them represent the core of the application (5 modules)
And each of the remaining modules represents the business process implementation related to a type of customer. These modules are completely independent of each other.
When packaging or releasing the 'Root' project, the artifact generated is a single ZIP file aggregating all the JARs related to 'Root' modules.
The single ZIP file is generated according to an assembly descriptor, it represents the delivery artifact.
At deployment time on the target environment, the single ZIP is unzipped under a directory where it is consumed (class loaded) by an 'engine', a Java web application that provides the final services.
Constraints
The 'business constraints' on one side,
And the desire to reduce regressions between different versions on the other side.
The above constraints led us to adopt the following release scenarios:
Either we release the Root and ALL of its submodules. This means that the resulting ZIP will aggregate all the submodule JARs with the same version. The ZIP will contain something similar to [ModuleA-1.1.jar, ModuleB-1.1.jar, ModuleC-1.1.jar, ModuleD-1.1.jar, ......., ModuleX-1.1.jar].
Or we release the Root and A FEW of its submodules, the ones that we want to update.
The resulting ZIP will still aggregate all the submodule JARs: the newly released submodules are aggregated with their latest released versions, and the unreleased submodules are aggregated with another 'appropriate' version.
For example, if we made such an incremental release, the ZIP would contain something similar to [ModuleA-1.2.jar, ModuleB-1.1.jar, ModuleC-1.2.jar, ModuleD-1.1.1.jar, ......., ModuleX-1.1.2.jar].
These 2 scenarios were made possible by:
Either declaring the modules as MAVEN MODULES ('module') for the first scenario,
Or declaring the modules as MAVEN DEPENDENCIES ('dependency') for the second scenario, the INCREMENTAL one.
Question
Both scenarios work perfectly, BUT in the 2nd (INCREMENTAL) scenario, maven-release-plugin:prepare uploads to the SCM (SVN) all the modules [ModuleA, ModuleB, ModuleD, .... ModuleX]. It uploads both the released and the non-released ones, even though the 'non-released modules' are declared as a 'dependency' in the POM and not as a 'module'.
1/ IS THERE a way to avoid uploading the 'non-released' modules? Is there a way to inject an 'exclude directory list' into the SCM SVN provider?
2/ A MORE global question: is the approach used a correct one, or is it an anti-pattern? In that case, what should the alternative be?
Thank you.
To me, your approach looks like an anti-pattern. I recommend only having projects in the same hierarchy that you want to release together. Projects with a different release lifecycle should live on their own; otherwise you will keep running into the issues you mentioned. If you run the release plugin from a root directory (multi-module setup), all of the content of that root directory will be tagged in SVN.
In your case, I would probably create the following hierarchies:
Core
One per customer type
Potentially one per type to bundle them (zip), depending on your structure
I would group it by the way you create the release. It might mean that you have to run the release plugin a couple of times instead of just once when you make a change e.g. in Core, but it will be a lot cleaner.
Your packaging project will then pull in all of the dependencies and package/assemble them.
If you have common configuration options, I recommend putting them into a common parent POM. This doesn't have to be your root (multi-module) POM.
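As an illustration of such a packaging project, a sketch with made-up coordinates and versions:

<!-- delivery/pom.xml: pulls in core and the per-customer modules at independent versions -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>delivery</artifactId>
  <version>1.2.0</version>
  <packaging>pom</packaging>
  <dependencies>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>core</artifactId>
      <version>1.2</version>
    </dependency>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>customer-a</artifactId>
      <version>1.1</version> <!-- an unchanged module keeps its last released version -->
    </dependency>
    <!-- ... -->
  </dependencies>
  <!-- your existing assembly descriptor can then zip these dependencies as before -->
</project>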
Did you try to run the maven-release-plugin with the -r argument plus the list of all modules you want to release?
Basically, this argument allows you to specify the list of modules against which the Maven command should be performed. (If you omit it, all submodules will be included; that is the default behavior.)
See more details about this command line here.
I have never tried it with the maven-release-plugin, and I don't know whether it will work, especially regarding SCM operations.
My need is pretty basic but I could not find any clean answer to it: I simply need to be able to distribute a resource in a multi-module project.
Let us consider for example the LICENSE file, which I hereby assume to be the same for all modules. I prefer not to manually copy it into each and every module because the file could change over time. I also prefer not to statically link to resources (even if using relative paths) outside the project folder, because the modular structure can possibly change too.
Is there any plugin that can be used to robustly guarantee that each module is given the required file? It would be equally acceptable for such a copy to be obtained by exploiting the POM of the parent project or to be performed directly by the super project in the modular hierarchy.
You could use the assembly and the dependency plugins. Did you stumble over this link?
http://www.sonatype.com/people/2008/04/how-to-share-resources-across-projects-in-maven/
It describes that option. It's from 2008, but Maven has been around for quite some time, so I guess it's more or less up to date.
Edit regarding the comment:
Another option is the maven-remote-resources-plugin.
For a more detailed example see:
http://maven.apache.org/plugins/maven-remote-resources-plugin/examples/sharing-resources.html
Since their intro actually speaks for itself, I quote (maven.apache.org):
This plugin is used to retrieve JARs of resources from remote repositories, process those resources, and incorporate them into JARs you build with Maven. A very common use-case is the need to package certain resources in a consistent way across your organization: at Apache it is required that every JAR produced contains a copy of the Apache license and a notice file that references all used software in a given project.
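A rough sketch of how that might look for the shared LICENSE case; the bundle coordinates and plugin version are made up, so check the linked example for the authoritative configuration:

<!-- In the project that owns the shared LICENSE (kept under src/main/resources) -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-remote-resources-plugin</artifactId>
  <version>3.1.0</version> <!-- illustrative version -->
  <executions>
    <execution>
      <goals>
        <goal>bundle</goal>
      </goals>
      <configuration>
        <includes>
          <include>**/LICENSE</include>
        </includes>
      </configuration>
    </execution>
  </executions>
</plugin>

<!-- In each consuming module, or in the shared parent POM -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-remote-resources-plugin</artifactId>
  <version>3.1.0</version>
  <executions>
    <execution>
      <goals>
        <goal>process</goal>
      </goals>
      <configuration>
        <resourceBundles>
          <resourceBundle>com.example:license-resources:1.0.0</resourceBundle>
        </resourceBundles>
      </configuration>
    </execution>
  </executions>
</plugin>

The bundle goal packages the shared files into a resource-bundle JAR, and once that JAR is installed or deployed, the process goal pulls it in and copies its contents into each consuming module's build output.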