convertpom task in Ivy: how to add more artifacts to the publications field (Maven)

When I use the convertpom task in Ivy to convert a pom.xml to an ivy.xml, I get the default publications:
<publications>
    <artifact name="XYZ" type="jar" ext="jar" conf="master"/>
</publications>
How do I modify pom.xml so that more artifacts are added to this during conversion? Where does convertpom pick the artifacts up from? Also, how do I change the type?
Is it possible to override these in the ivy:publish call?
I found this piece of code in the convertpom Ant task, but I am not sure how it is used.
private void addSourcesAndJavadocArtifactsIfPresent(
        PomModuleDescriptorBuilder mdBuilder, ParserSettings ivySettings) {
    if (mdBuilder.getMainArtifact() == null) {
        // no main artifact in pom, we don't need to search for meta artifacts
        return;
    }
    ModuleDescriptor md = mdBuilder.getModuleDescriptor();
    ModuleRevisionId mrid = md.getModuleRevisionId();
    DependencyResolver resolver = ivySettings.getResolver(mrid);
    if (resolver == null) {
        Message.debug("no resolver found for " + mrid
                + ": no source or javadoc artifact lookup");
    } else {
        ArtifactOrigin mainArtifact = resolver.locate(mdBuilder.getMainArtifact());
        if (!ArtifactOrigin.isUnknown(mainArtifact)) {
            String mainArtifactLocation = mainArtifact.getLocation();
            ArtifactOrigin sourceArtifact = resolver.locate(mdBuilder.getSourceArtifact());
            if (!ArtifactOrigin.isUnknown(sourceArtifact)
                    && !sourceArtifact.getLocation().equals(mainArtifactLocation)) {
                Message.debug("source artifact found for " + mrid);
                mdBuilder.addSourceArtifact();
            } else {
                // it seems that sometimes the 'src' classifier is used instead of 'sources'
                // Cfr. IVY-1138
                ArtifactOrigin srcArtifact = resolver.locate(mdBuilder.getSrcArtifact());
                if (!ArtifactOrigin.isUnknown(srcArtifact)
                        && !srcArtifact.getLocation().equals(mainArtifactLocation)) {
                    Message.debug("source artifact found for " + mrid);
                    mdBuilder.addSrcArtifact();
                } else {
                    Message.debug("no source artifact found for " + mrid);
                }
            }
            ArtifactOrigin javadocArtifact = resolver.locate(mdBuilder.getJavadocArtifact());
            if (!ArtifactOrigin.isUnknown(javadocArtifact)
                    && !javadocArtifact.getLocation().equals(mainArtifactLocation)) {
                Message.debug("javadoc artifact found for " + mrid);
                mdBuilder.addJavadocArtifact();
            } else {
                Message.debug("no javadoc artifact found for " + mrid);
            }
        }
    }
}

This really demonstrates one of the key differences between Maven and Ivy...
An ivy file explicitly lists all the files contained in the module. A Maven POM, on the other hand, does not; instead, zero or more additional files can be stored alongside the main artifact, each with a different "classifier" to make the filename unique.
I can't see any way to build the complete "publications" section in Ivy without having access to the Maven module's filesystem. Are you using a Maven repository manager? Nexus has a REST API that you could perhaps invoke to obtain all files in a module (just a thought).
Another idea is to submit a request to extend the convertpom task. Create some optional child tags that enable you to list the available classifiers:
<ivy:convertpom pomFile="pom.xml" ivyFile="ivy.xml">
    <classifier name="sources"/>
    <classifier name="javadoc"/>
    <classifier name="archive" type="tar.gz"/>
</ivy:convertpom>
I don't see this change being very popular (or useful). Most people are converting in the other direction using the makepom task.

Related

How can I reuse an IvyPublication in a custom PublishToIvyRepository task?

I have a build.gradle file with an artifact I am publishing, following the guidelines given in the Ivy publishing documentation.
publishing {
    publications {
        ivy(IvyPublication) {
            from components.java
            descriptor.withXml {
                asNode().info[0].appendNode("description", description)
            }
        }
    }
}
I have a separate PublishToIvyRepository task, which I would like to configure so that it goes to a different repository than normal, but uses the same publication as the code above. My initial attempt is this:
task publishToIvyLocal(type: PublishToIvyRepository) {
    repository = mySpecialRepository
    publication = project.publishing.publications[0]
}
However, this doesn't seem to work. If I put it before the publishing {} block above, I get the following error:
Cannot configure the 'publishing' extension after it has been accessed.
I'm guessing that project.publishing.publications[0] isn't the best way to reuse this publication.
How can I reuse an IvyPublication in a custom PublishToIvyRepository task?
There is no need to create a PublishToIvyRepository task on your own.
Applying the 'ivy-publish' plugin does the following:
[...]
Establishes a rule to automatically create a PublishToIvyRepository task for the combination of each IvyPublication added (see Section 35.2, “Publications”), with each IvyArtifactRepository added (see Section 35.3, “Repositories”).
So, simply add your publication with your two repositories and two tasks will be created, one to publish the publication for each repository.
The created task is named publish«PUBNAME»PublicationTo«REPONAME»Repository, which is publishIvyJavaPublicationToIvyRepository for this example.
Some example code:
publishing {
    publications {
        mySpecial(IvyPublication) {
            // configure publication
        }
    }
    repositories {
        ivy {
            name = 'first'
            // configure first repository
        }
        ivy {
            name = 'second'
            // configure second repository
        }
    }
}
This should create the following tasks:
publishMySpecialPublicationToFirstRepository
publishMySpecialPublicationToSecondRepository
Regarding the repository name:
The name for this repository. A name must be unique amongst a repository set. A default name is provided for the repository if none is provided.
Given your code sample, I agree with lu.koerfer's answer, but in case you really need a custom publication task, you can use project.afterEvaluate to access the publishing container after it has been configured:
project.afterEvaluate {
    customPublicationTask.publication = project.publishing.publications["ivy"]
    // a publication can be accessed by its name
}
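For completeness, a minimal sketch of how the two pieces could fit together (assuming, as in the question, that the publication is named 'ivy' and that mySpecialRepository is defined elsewhere in the build):
task publishToIvyLocal(type: PublishToIvyRepository) {
    // the repository can be set eagerly
    repository = mySpecialRepository
}
project.afterEvaluate {
    // the publication container is fully populated by now
    publishToIvyLocal.publication = project.publishing.publications['ivy']
}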

How to get dependencies from a gradle plugin using "api" or "implementation" directives

Background: I am running Android Studio 3.0-beta7 and trying to get a javadoc task to work for an Android library (the fact that this is not available as a ready-made task in the first place is really strange). I managed to tweak an answer to a different question (https://stackoverflow.com/a/46810617/1226020) for my needs, ending up with this code:
task javadoc(type: Javadoc) {
    failOnError false
    source = android.sourceSets.main.java.srcDirs
    // Also add the generated R class to avoid errors...
    // TODO: debug is hard-coded
    source += "$buildDir/generated/source/r/debug/"
    // ... but exclude the R classes from the docs
    excludes += "**/R.java"
    // TODO: "compile" is deprecated in Gradle 4.1,
    // but "implementation" and "api" are not resolvable :(
    classpath += configurations.compile
    afterEvaluate {
        // Wait after evaluation to add the android classpath
        // to avoid "buildToolsVersion is not specified" error
        classpath += files(android.getBootClasspath())
        // Process AAR dependencies
        def aarDependencies = classpath.filter { it.name.endsWith('.aar') }
        classpath -= aarDependencies
        aarDependencies.each { aar ->
            System.out.println("Adding classpath for aar: " + aar.name)
            // Extract classes.jar from the AAR dependency, and add it to the javadoc classpath
            def outputPath = "$buildDir/tmp/exploded-aar/${aar.name.replace('.aar', '.jar')}"
            classpath += files(outputPath)
            // Use a task so the actual extraction only happens before the javadoc task is run
            dependsOn task(name: "extract ${aar.name}").doLast {
                extractEntry(aar, 'classes.jar', outputPath)
            }
        }
    }
}
// Utility method to extract only one entry in a zip file
private def extractEntry(archive, entryPath, outputPath) {
    if (!archive.exists()) {
        throw new GradleException("archive $archive not found")
    }
    def zip = new java.util.zip.ZipFile(archive)
    zip.entries().each {
        if (it.name == entryPath) {
            def path = new File(outputPath)
            if (!path.exists()) {
                path.getParentFile().mkdirs()
                // Surely there's a simpler is->os utility except
                // the one in java.nio.Files? Ah well...
                def buf = new byte[1024]
                def is = zip.getInputStream(it)
                def os = new FileOutputStream(path)
                def len
                while ((len = is.read(buf)) != -1) {
                    os.write(buf, 0, len)
                }
                os.close()
            }
        }
    }
    zip.close()
}
This code tries to find all dependency AARs, loops through them, extracts classes.jar from each, and puts them in a temp folder that is added to the classpath during javadoc generation. Basically, it tries to reproduce what the really old Android Gradle plugin used to do with "exploded-aar".
However, the code relies on using compile dependencies. Using api or implementation, which are recommended with Gradle 4.1, will not work, since these are not resolvable from a Gradle task.
Question: how can I get a list of dependencies using the api or implementation directives when e.g. configurations.api produces a "not resolvable" error?
Bonus question: is there a new, better way to create javadocs for a library with Android Studio 3.0 that doesn't involve 100 lines of workarounds?
You can wait for this to be merged:
https://issues.apache.org/jira/browse/MJAVADOC-450
Basically, the current Maven Javadoc plugin ignores classifiers such as AAR.
I ran into the same problem when trying your answer to this question; this error message kept me from resolving the implementation dependencies:
Resolving configuration 'implementation' directly is not allowed
Then I discovered that this answer has a solution that makes resolving of the implementation and api configurations possible:
configurations.implementation.setCanBeResolved(true)
I'm not sure how dirty this workaround is, but it seems to do the trick for the javadocJar task situation.
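A minimal sketch of how this could be applied to the javadoc task above (an assumption on my part; flipping canBeResolved on declaration-only configurations is a workaround and may behave differently in newer Gradle/AGP versions):
// workaround: make the declaration-only configurations resolvable
configurations.implementation.setCanBeResolved(true)
configurations.api.setCanBeResolved(true)

task javadoc(type: Javadoc) {
    // ... same setup as in the task above, but replace the deprecated
    // classpath += configurations.compile
    // with the now-resolvable configurations:
    classpath += configurations.implementation + configurations.api
}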

Gradle publish multiple independent artifacts

I've got a project that builds using Gradle and the ivy-publish plugin. In addition to building a JAR, build.gradle also executes a run task that executes XmlFileGenerator.main(), which generates 5 XML files (call them A, B, C, D, and E). I'm looking to publish each of these XML files to our Ivy repository; each should have the same group and version but a different module and a different filename, and each should have its own ivy.xml that lists only itself.
I'm able to set the filename of the file that's published, but the module name remains the same as my project's name, and as a result all of my XML files are published under the same module name instead of under independent ones.
So for example, I want A.xml to be published at {myLocalIvyRootDir}\my-group\A\{version}\xmls\A-{version}.xml and I want B.xml to be published at {myLocalIvyRootDir}\my-group\B\{version}\xmls\B-{version}.xml. But instead A is published at {myLocalIvyRootDir}\my-group\my-project\{version}\xmls\A-{version}.xml and B is published alongside it at {myLocalIvyRootDir}\my-group\my-project\{version}\xmls\B-{version}.xml.
Here's the relevant subset of build.gradle (showing only A but not B-E):
apply plugin: 'ivy-publish'
group = 'my-group'
publishing {
    publications {
        ivy(IvyPublication) {
            artifact jar
        }
        aXml(IvyPublication) {
            artifact('target/A.xml') {
                name = 'A'
                extension = 'xml'
                type = 'xml'
            }
        }
    }
}
mainClassName = 'my-group.my-project.XmlFileGenerator'
I've tried defining the module property on the publication with this code:
aXml(IvyPublication) {
    module 'A'
    artifact('target/A.xml') {
        name = 'A'
        extension = 'xml'
        type = 'xml'
    }
}
But I get the following error message:
> org.gradle.api.internal.MissingMethodException: Could not find method module() for arguments [A] on org.gradle.api.publish.ivy.internal.publication.DefaultIvyPublication_Decorated@32384c50.
And I've tried changing the rootProject.name dynamically with code like:
publishing {
    publications {
        ivy(IvyPublication) {
            artifact jar
        }
        project.metaClass.getName {"A"}
        aXml(IvyPublication) {
            artifact('target/A.xml') {
                name = 'A'
                extension = 'xml'
                type = 'xml'
            }
        }
    }
}
That produced no errors, but also no change in behavior.
I feel like I'm probably just missing something small, but don't know what it is. Can anyone point me in the right direction?
It turned out that this particular project was still pointing to Gradle 1.6, before these properties were made available (they were added in 1.7). So all that was needed was to point to 1.7, and everything worked as intended.
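For reference, a sketch of what the publication block from the question would look like once the build uses Gradle 1.7 or later (names and paths taken from the question):
publishing {
    publications {
        aXml(IvyPublication) {
            // the module property is available from Gradle 1.7 onwards
            module = 'A'
            artifact('target/A.xml') {
                name = 'A'
                extension = 'xml'
                type = 'xml'
            }
        }
        // repeat (or generate in a loop) for B-E
    }
}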

How to programmatically list all transitive dependencies, including overridden ones in Maven using DependencyGraphBuilder?

This is similar to other questions (like this), but I want to be able to do this with the latest APIs. The maven-dependency-plugin:tree verbose option has been deprecated and does nothing in the latest (2.5.1) code, so there is no good example of how to do it.
I believe the Aether utility class from jcabi-aether can help you get a list of all dependencies of any Maven artifact, for example:
File repo = this.session.getLocalRepository().getBasedir();
Collection<Artifact> deps = new Aether(this.getProject(), repo).resolve(
    new DefaultArtifact("junit", "junit-dep", "", "jar", "4.10"),
    JavaScopes.RUNTIME
);
If you're outside of a Maven plugin:
File repo = new File("/tmp/local-repository");
MavenProject project = new MavenProject();
project.setRemoteProjectRepositories(
    Arrays.asList(
        new RemoteRepository(
            "maven-central",
            "default",
            "http://repo1.maven.org/maven2/"
        )
    )
);
Collection<Artifact> deps = new Aether(project, repo).resolve(
    new DefaultArtifact("junit", "junit-dep", "", "jar", "4.10"),
    "runtime"
);
The only dependency you need is:
<dependency>
    <groupId>com.jcabi</groupId>
    <artifactId>jcabi-aether</artifactId>
    <version>0.7.5</version>
</dependency>
Including my approach here, as the additional steps may become part of your actual use case, especially if you are working on a composite or multi-module project.
(Maven 3, my runtime was 3.6; no direct dependency on Aether)
In my case I wanted to resolve the dependency tree of a specific artifact foo-runtime from inside my plugin; however,
some of the dependency versions were only available in its parent POM, foo-parent (i.e. absent from foo-runtime's own POM).
The parent POM also had additional details, such as exclusions for some of foo-runtime's dependencies - via dependencyManagement.
So I had to:
explicitly load the parent's model,
link the child model to it,
fill in the missing version numbers of the child model (still not sure why Maven didn't automatically resolve these after linking the parent), and then
run dependency resolution for the child.
To avoid building the model from scratch, I derived the model of foo-runtime using an existing artifact, foo-api (which in my case is always guaranteed to be present in the Maven project being built). All these artifacts share the same groupId.
@Component
public LifecycleDependencyResolver resolver;

// ...
// `artifacts` contains all artifacts of current/reactor `MavenProject` obtained via `project.getArtifacts()`
private Set<Artifact> resolveRuntimeDeps(Set<Artifact> artifacts) throws MojoExecutionException {
    // foo-api will always be present; use it to derive coordinates for foo-runtime
    Artifact fooApi = artifacts.stream().filter(artifact -> "foo-api".equals(artifact.getArtifactId()))
            .findFirst().orElseThrow(() -> new MojoExecutionException("Unable to find foo-api"));
    Collection<String> scopes = Arrays.asList("compile", "runtime");
    MavenProject fooRoot = deriveProject(fooApi, "foo-parent");
    Model fooRootPom = fooRoot.getModel();
    MavenProject fooSrv = deriveProject(fooApi, "foo-runtime");
    fooSrv.setParent(fooRoot);
    // some foo-runtime deps depend on versions declared on parent pom; merge them
    Map<String, Artifact> depMgt = fooRootPom.getDependencyManagement().getDependencies().stream()
            .collect(Collectors.toMap(dep -> dep.getGroupId() + ":" + dep.getArtifactId() + ":" + dep.getType(), this::toArtifact));
    for (Dependency d : fooSrv.getDependencies()) {
        if (d.getVersion() == null) {
            Artifact managed = depMgt.get(d.getGroupId() + ":" + d.getArtifactId() + ":" + d.getType());
            if (managed != null) {
                d.setVersion(managed.getVersion());
            }
        }
    }
    try {
        resolver.resolveProjectDependencies(fooSrv, scopes, scopes, session, false, Collections.emptySet());
        return fooSrv.getArtifacts();
    } catch (LifecycleExecutionException e) {
        throw new MojoExecutionException("Error resolving foo-runtime dependencies", e);
    }
}

// load POM for another artifact based on foo-api JAR available in current project
private MavenProject deriveProject(Artifact fooApi, String artifactId) throws MojoExecutionException {
    Model pom;
    String pomPath = fooApi.getFile().getAbsolutePath().replaceAll("foo-api", artifactId).replaceAll("\\.jar$", ".pom");
    try (InputStream fooRootPomData = new FileInputStream(pomPath)) {
        pom = new MavenXpp3Reader().read(fooRootPomData);
        pom.setPomFile(new File(pomPath));
    } catch (IOException | XmlPullParserException e) {
        throw new MojoExecutionException("Error loading " + artifactId + " metadata", e);
    }
    // set these params to avoid skips/errors during resolution
    MavenProject proj = new MavenProject(pom);
    proj.setArtifact(toArtifact(pom));
    proj.setArtifactFilter(Objects::nonNull);
    proj.setRemoteArtifactRepositories(Collections.emptyList());
    return proj;
}

private Artifact toArtifact(Model model) {
    return new DefaultArtifact(
            Optional.ofNullable(model.getGroupId()).orElseGet(() -> model.getParent().getGroupId()), model.getArtifactId(),
            Optional.ofNullable(model.getVersion()).orElseGet(() -> model.getParent().getVersion()), "compile", model.getPackaging(), null,
            project.getArtifact().getArtifactHandler());
}

private Artifact toArtifact(Dependency dep) {
    return new DefaultArtifact(dep.getGroupId(), dep.getArtifactId(), dep.getVersion(), dep.getScope(), dep.getType(), dep.getClassifier(),
            project.getArtifact().getArtifactHandler());
}
(I tried almost all the other suggested approaches; however, all of them ended up with one error or another. Now, looking back, I suspect many of those errors might have been due to the fact that my leaf POM was missing version numbers for some artifacts. It seems (acceptably so) that the "model enrichment" phase - propagating parent versions etc. - is carried out by some earlier component in Maven's flow, and the caller has to take care of this, at least partially, when invoking the dependency resolver from scratch.)

Maven plugin: filter out unimportant artifact dependencies

I'm developing a Maven plugin and using the MavenProject object to access my dependencies with project.getDependencyArtifacts(), but this gives me all jars, even the test-only ones.
Is there some method to filter out all non-runtime jars? If I just get the scope and compare with scope.equals("runtime"), I will throw out compile and other important dependencies.
I did not find an existing method for this either, so I'm using the following logic. This is a plugin building a customized EAR, which adds the needed dependencies to an XML file and includes them in the archive. It uses getArtifacts instead of getDependencyArtifacts, since I'm also interested in transitive dependencies.
Collection<Artifact> dependencies = new ArrayList<Artifact>();
dependencies.addAll(project.getArtifacts());
for (Iterator<Artifact> it = dependencies.iterator(); it.hasNext(); ) {
    Artifact dependency = it.next();
    String scope = dependency.getScope();
    String type = dependency.getType();
    if (dependency.isOptional() || !"jar".equals(type) || "provided".equals(scope) || "test".equals(scope) || "system".equals(scope)) {
        getLog().debug("Pruning dependency " + dependency);
        it.remove();
    }
}
