Short version of question:
Is there a way of telling Gradle not to resolve dependencies? I know I can skip a single task with the -x switch, but resolving dependencies doesn't seem to be performed through any particular task, so I don't know how to skip it.
Long version:
Right now I can run tests from Gradle with a simple 'gradle test', which gathers dependencies, builds, and runs the tests.
But I'd also like to run the tests using Gradle on another machine which can't download dependencies from Maven. I thought I could perform some packaging step that downloads all dependencies into a lib folder, and then extend the test classpath (in that task) to include this folder. The problem is that Gradle still tries to contact Maven when I run 'gradle myTests'. Is there a way of preventing dependency resolution for this single task?
There's the --offline flag. Alternatively, you can declare a flatDir repository rather than a Maven repository whenever the build should be run with "local" dependencies.
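A minimal sketch of the flatDir approach, assuming the pre-downloaded JARs have been copied into a local libs directory (the directory name is just an example):

repositories {
    // resolve artifacts from a local directory instead of a remote Maven repository
    flatDir {
        dirs 'libs'
    }
}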
For my use case, my internet access is restricted, so I set up the dependencies while I still have full access. When I'm restricted, I go to Preferences, search for Gradle, and check "Offline work".
Of course I'll have to turn offline mode off again whenever new dependencies are added.
I have a Maven package I've hosted on GitHub package registry.
Whenever I make an update to the package I run mvn deploy to publish the changes, but if I simply run gradle install on the dependent application it doesn't seem to install the latest version of the package
(not sure if settings.xml is relevant to this question so I removed it, but it can be seen at the link to my previous question).
I had a similar issue with using the latest snapshot version of the package in another dependent, which was using Maven as the package manager/build tool instead of Gradle. That was resolved by checking a box to "always update snapshots" in Maven settings. I have checked the box in this project as well, but it doesn't seem to resolve the issue now.
What I have tried:
Invalidating cache and restarting IntelliJ
reimporting all Gradle projects
deleting the dependency from my build.gradle, reimporting all projects and running install, then adding it back and reimporting all projects and running install again
running ./gradlew build -x test --refresh-dependencies (tests disabled because they were failing)
This is the log after I run gradle install:
4:07:08 PM: Executing task 'install'...
> Task :compileJava UP-TO-DATE
> Task :processResources UP-TO-DATE
> Task :classes UP-TO-DATE
> Task :jar SKIPPED
> Task :install
BUILD SUCCESSFUL in 2s
3 actionable tasks: 1 executed, 2 up-to-date
4:07:10 PM: Task execution finished 'install'.
In my build.gradle I use the following syntax for my dependency (under dependencies):
compile('com.companyname:packagename:0.0.3-SNAPSHOT')
and this is what I have under repositories:
maven {
    url "https://maven.pkg.github.com/companyname/packagename"
    credentials {
        username "TaylorBurke"
        password "*****************"
    }
}
Not sure if it is related, but when I go into my Maven settings to try and update the repository I get this error:
So there it is, I think I've included everything. Is it a configuration issue with Maven, Gradle, or IntelliJ?
Edit: because it has been suggested to close this question, I am pointing out that the linked question does not address installing with Gradle; it simply addresses an error after running mvn deploy. I have already deployed the new package successfully and can get the new version from my other application. My problem is specific to gradle install. Even though the accepted answer mentions he had a similar problem using Gradle (my problem is not with deploying either), he goes on to say that snapshot versions would solve the problem he described, and I am already using a snapshot version in this package. That question is clearly quite different and not at all related to mine.
You have tried quite a few things with IntelliJ, but the problem happens when you run the build from the command line (./gradlew build). That should be a good indication that the problem is not with IntelliJ.
By default, Gradle will cache changing dependencies (e.g. SNAPSHOT dependencies) for 24 hours. During that time, it will not ask the repository for newer versions. So if you publish a new version under the same name, Gradle might not see it for another day.
Using the --refresh-dependencies option will make Gradle ignore the cache, and thereby download the artifacts again.
You can also change the cache retention period through a resolutionStrategy, and you can configure it to always check for changed dependencies if you like.
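For example, a sketch of a build.gradle snippet that makes Gradle treat cached changing (SNAPSHOT) modules as stale immediately, so the repository is checked on every resolution:

configurations.all {
    // 0 seconds effectively disables caching for changing modules such as SNAPSHOTs
    resolutionStrategy.cacheChangingModulesFor 0, 'seconds'
}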
Read more about dynamic dependencies here: https://docs.gradle.org/current/userguide/dynamic_versions.html
If you are curious, the Gradle artifact cache is by default located in $USER_HOME/.gradle/caches/modules-2/files-2.1 (the numbers may be different depending on which version of Gradle you are using). This cache is unrelated to the one you mention in IntelliJ.
Also, the authentication error in the IntelliJ Maven repository browser occurs because your credentials are configured in the Gradle build and not in IntelliJ's Maven settings. So this is also unrelated to your Gradle problem.
I have Maven dependencies with scope test.
I add the flag -Dmaven.test.skip=true, but Maven still downloads the test dependencies.
Is there a way to avoid pulling in test dependencies if I want to build only the production part?
This is how Maven works: it first checks that all dependencies are available, no matter their scope. Only then does it continue to the test phase and find out that the tests should not be executed.
A possible workaround is to define your test dependencies in a separate Maven test profile, which is not applied when you do not want to run tests. Profiles are resolved before any dependencies are downloaded, so if the profile does not add the test dependencies to the effective POM, they are not downloaded at all.
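A minimal sketch of such a profile, using JUnit as a placeholder test dependency; the profile is active only while the maven.test.skip property is not set on the command line:

<profiles>
  <profile>
    <id>with-test-dependencies</id>
    <activation>
      <property>
        <!-- active only when -Dmaven.test.skip is NOT passed -->
        <name>!maven.test.skip</name>
      </property>
    </activation>
    <dependencies>
      <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.13.2</version>
        <scope>test</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>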
The flag -Dmaven.test.skip will only skip compilation and execution of the tests within the project you run.
It's often better to use -DskipTests, as this will compile the test classes but not run them. See the Surefire documentation.
This has nothing to do with dependencies. Those are loaded onto the classpath depending on their scope and on what the plugins require. The Surefire plugin requires resolution of the test scope, as it runs the unit tests.
If there are test-scoped dependencies which you do not want to use, you need to remove them, or exclude them if they come in via transitive dependencies (dependencies of dependencies). You can run mvn dependency:tree to figure out why a given jar is in the project.
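A sketch of such an exclusion, with placeholder coordinates for the direct dependency and the unwanted transitive artifact:

<dependency>
  <groupId>com.example</groupId>
  <artifactId>some-library</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <!-- the unwanted transitive artifact, as reported by mvn dependency:tree -->
      <groupId>org.unwanted</groupId>
      <artifactId>unwanted-test-helper</artifactId>
    </exclusion>
  </exclusions>
</dependency>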
If you add dependencies with test scope, Maven will first check whether each dependency is available and only then check its scope.
You can create a Maven profile, add the test dependencies under that profile, and activate the profile only when the -Dmaven.test.skip (or -Dmaven.test.skip=true) option is not present. This way you can keep your build command unchanged.
You can check this simple project manage-test-dependencies-in-maven-the-proper-way to understand it better.
We are new to Gradle and dependency resolution. I am in the process of creating pom.xml files for all our internally-generated artifacts and want to set up a job in our Jenkins server to verify the dependencies are properly defined and not conflicting (i.e. LibA requires x-1.0.jar, LibB requires x-1.1.jar, and AppY requires both LibA and LibB).
As such, I've set up a dummy project in SVN that simply includes a bunch of our internal artifacts as dependencies. Following TDD, I intentionally included some errors in the declarations (i.e. group and name, but not version). Sure enough, those dependencies can't be found.
But when I run this build with Gradle (i.e. gradle dependencies), it includes all the failure messages but still says the build succeeded! Not good!
How can I, using Gradle/Jenkins, set up an automated job that will verify all dependencies are found?
There is no built-in task that resolves all dependencies and fails if a dependency isn't found. (IDE tasks are graceful in case of missing dependencies.) But you can easily write your own:
task resolveDependencies {
    doLast {
        configurations.all { it.resolve() }
    }
}
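You can then run gradle resolveDependencies as a Jenkins build step; the task will fail (and so will the build) if any declared dependency cannot be resolved.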
gradle dependencies is, by design, a reporting task: it displays the project's dependency report and, if a given dependency cannot be resolved, merely marks it with a red FAILED next to the unresolved dependency instead of failing the build. To get an error, use a task that actually depends on resolving the given configuration(s), such as gradle check.
Updated. Gradle is smart about determining whether given tasks need to be executed. Therefore, if there are no source files to compile (compilation requires the dependent classes/JARs to be resolved), gradle check can notice that executing the compileJava/compileTestJava tasks is not needed (the tasks are skipped as up-to-date). You can force it by adding any Java source file under src/test/java (tests also require the production dependencies from the compile configuration).
This is just a workaround, there is probably a better way to do that (and I hope someone else will present it here).
When I build my application with Maven, I run mvn clean install. As part of the install lifecycle, I run appengine:devserver_start from Google's GAE Maven plugin. This appears to be already bound to a phase in the lifecycle, and therefore it reruns some build steps from the beginning, even though running mvn install already performed those. For example, the resources phase is rerun. I have my own Java script run to download the latest resources for my build, but because of appengine:devserver_start, I needlessly have to run this script again since the resources phase is re-executed.
I can think of two ways to avoid this, but I'm not sure how to configure either of them. The first would be to somehow skip re-running build steps that have already run. The other would be to change a Maven POM property just for the plugin execution. I have a Maven property, set either to true or false, that controls the skip setting for the Java script I use during resources (I run this script using the exec-maven-plugin). Think of it as a Maven property that can be set with the -D flag. Can I have this property changed just for the plugin?
If you are having trouble picturing my scenario, consider what happens when you run mvn compile install: all lifecycle phases up to compile will run, and then all phases up to install will run, which includes running compile again.
A common and easy way to solve this kind of problem is to use a Maven profile. Just create a new profile that includes the plugin bound to the phases you prefer.
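A rough sketch of such a profile, assuming the com.google.appengine:appengine-maven-plugin coordinates implied by the question; the version property and the chosen phase are placeholders to adapt to your build:

<profiles>
  <profile>
    <id>devserver</id>
    <build>
      <plugins>
        <plugin>
          <groupId>com.google.appengine</groupId>
          <artifactId>appengine-maven-plugin</artifactId>
          <!-- placeholder: reuse the version already declared in your build -->
          <version>${appengine.version}</version>
          <executions>
            <execution>
              <!-- bind devserver_start to whichever phase suits you -->
              <phase>pre-integration-test</phase>
              <goals>
                <goal>devserver_start</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>

With that, mvn clean install -Pdevserver starts the dev server as part of the build, while a plain mvn clean install skips it entirely.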
You probably shouldn't fight it and should instead just run clean appengine:devserver_start rather than clean install. Read my answer here for a more detailed explanation:
https://stackoverflow.com/a/17638442/2464295
I have a Maven multi-module project with several modules. I want to deploy them (mvn deploy) only if they all pass a full mvn install (which includes the tests).
Currently, I run mvn install on the project. If all modules pass, I run mvn deploy to do the deployment. The problem I see is the wasted time of calling mvn twice (even if I skip the tests on the second run).
Does anyone have an idea on this?
EDIT: I have learned that using Artifactory as a repository manager and the maven-artifactory-plugin with your maven setup will add the atomic deploy behaviour to the mvn deploy command. See the Build Integration section in the Artifactory documentation.
[DISCLOSURE - I'm associated with JFrog. Artifactory creator.]
Take a look at the deployAtEnd parameter of the Maven Deploy Plugin: http://maven.apache.org/plugins/maven-deploy-plugin/deploy-mojo.html
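A minimal sketch of enabling it in the parent POM; deployAtEnd requires maven-deploy-plugin 2.8 or newer, and the version shown here is just an example:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <version>2.8.2</version>
  <configuration>
    <!-- defer all uploads until every module has built successfully -->
    <deployAtEnd>true</deployAtEnd>
  </configuration>
</plugin>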
This is a bit tricky. Maven is not atomic when it executes the build life-cycle. So a broken set of artifacts may end up in a repository.
One solution I know of is Nexus Pro: http://www.sonatype.com/Products/Nexus-Professional/Features - it allows you to promote builds or define certain repositories as staging, so only verified versions get promoted for use. Maybe Artifactory has something similar; I just don't know.
If that solution is too expensive, you probably need to create a cleanup build or profile to remove artifacts that were already uploaded. My first guess would be to write a Maven plugin that uses the repository manager's remote API, or maybe the existing Maven features are already sufficient. But since deploy also means updating the metadata XML files, I don't think there is a delete operation; I'm not sure on this either.