Maven install became very slow after using "LATEST" and "RELEASE" versions

I'm using Maven 2.2.1 in my daily work. Recently I found that my "maven install" has become quite slow (about 3 times slower).
After a lot of compiling and debugging, I found that it's the many RELEASE and LATEST version dependencies (which used to be specific version numbers) that slow the build down.
I know that "RELEASE" and "LATEST" are both deprecated in Maven and should not be used any more, but adopting "RELEASE" and "LATEST" is something I cannot change in my job. I can only try some luck with the code, and here it is:
public class DefaultMavenProjectBuilder extends AbstractLogEnabled
implements MavenProjectBuilder, Initializable, Contextualizable
{
private Map processedProjectCache = new HashMap();
public MavenProject buildFromRepository( Artifact artifact,...) {
String cacheKey = createCacheKey( artifact.getGroupId(), artifact.getArtifactId(), artifact.getVersion() ); // <========= the "get key"
MavenProject project = (MavenProject) processedProjectCache.get( cacheKey );
if ( project != null )
{
return project;
}
Model model = findModelFromRepository( artifact, remoteArtifactRepositories, localRepository, allowStubModel );
ProjectBuilderConfiguration config = new DefaultProjectBuilderConfiguration().setLocalRepository( localRepository );
MavenProject mavenProject = buildInternal("Artifact [" + artifact + "]", model, config, remoteArtifactRepositories,
null, false);
return mavenProject;
}
private MavenProject buildInternal(String pomLocation,
Model model,
ProjectBuilderConfiguration config,
List parentSearchRepositories,
File projectDescriptor,
boolean strict, String cacheKey)
throws ProjectBuildingException
{
//...
cacheKey = createCacheKey( project.getGroupId(), project.getArtifactId(), project.getVersion() ); // <========= the "set key"
processedProjectCache.put( cacheKey, project );
//...
}
}
I found that the main reason is that buildInternal() is invoked about 10 times more often than before, and what causes that is that the "set key" and the "get key" have become different, for example:
set key: com.foo:bar:RELEASE
get key: com.foo:bar:1.0.11
My plan is to pass the "get key" into buildInternal() and use it directly as the "set key", but I'm not sure this won't break other things...
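For illustration, the change I have in mind looks roughly like this (just an untested sketch; it passes the key computed in buildFromRepository() down as the extra cacheKey parameter shown in the buildInternal() signature above, and caches the project under that requested key in addition to the resolved one, which seems safer than replacing the resolved key outright):
// in buildFromRepository(): hand the "get key" down to buildInternal()
MavenProject mavenProject = buildInternal( "Artifact [" + artifact + "]", model, config,
        remoteArtifactRepositories, null, false, cacheKey );

// in buildInternal( ..., boolean strict, String cacheKey ):
// keep the existing "set key" under the resolved version (e.g. com.foo:bar:1.0.11) ...
String resolvedKey = createCacheKey( project.getGroupId(), project.getArtifactId(), project.getVersion() );
processedProjectCache.put( resolvedKey, project );
// ... and also cache the project under the requested key (e.g. com.foo:bar:RELEASE)
// so the next buildFromRepository() call for the meta-version gets a cache hit
if ( cacheKey != null && !cacheKey.equals( resolvedKey ) )
{
    processedProjectCache.put( cacheKey, project );
}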
Are there any suggestions?
Thanks in advance :)

Related

Gradle how to change version number in source code

Java code:
public static String VERSION = "version_number";
Gradle build.gradle
version = '1.0'
How do I set the version in Java code from Gradle? The version must be in the source code.
Is there a convenient way? A not-so-nice way:
copy the Java file to another location, e.g. build/changed-source
change the version in the source by replacing a token
add build/changed-source to the main source set.
I'd do something similar to Michael Easter's answer, but with these differences:
Store generated sources separately from main sources (src/main/java and $buildDir/generated/java). This has the added benefit of not needing a custom .gitignore
Generate into a subdirectory of $buildDir so that the clean task will delete the generated sources
Use a separate task for code generation with proper up-to-date & skip support
Use Copy.expand(Map) to do the token replacement
Since it's directory-based, everything in src/template/java will have tokens replaced. You can easily add more templates in the future
src/template/java/com/foo/BuildInfo.java
package com.foo;
public class BuildInfo {
public static String getVersion() {
return "${version}";
}
}
build.gradle
task generateJava(type:Copy) {
def templateContext = [version: project.version]
inputs.properties templateContext // for gradle up-to-date check
from 'src/template/java'
into "$buildDir/generated/java"
expand templateContext
}
sourceSets.main.java.srcDir "$buildDir/generated/java" // add the extra source dir
compileJava.dependsOn generateJava // wire the generateJava task into the DAG
One method is similar to your not-so-nice way, but slightly easier. Consider a file in templates/BuildInfo.java:
package __PACKAGE;
public class BuildInfo {
private static final String version = "__VERSION";
private static final String buildTimestamp = "__BUILD_TIMESTAMP";
public String toString() {
return "version : " + version + "\n" +
"build timestamp : " + buildTimestamp + "\n";
}
}
This file can then be "stamped" with information as the first thing in the compileJava task and written to src/main/java/your/package/BuildInfo.java:
def targetPackage = 'net/codetojoy/util'
def targetPackageJava = 'net.codetojoy.util'
def appVersion = project.appVersion // from gradle.properties
def buildTimeStamp = new Date().toString()
compileJava {
doFirst {
ant.mkdir(dir: "${projectDir}/src/main/java/${targetPackage}")
def newBuildInfo = new File("${projectDir}/src/main/java/${targetPackage}/BuildInfo.java")
def templateBuildInfo = new File("${projectDir}/templates/TemplateBuildInfo.java")
newBuildInfo.withWriter { def writer ->
templateBuildInfo.eachLine { def line ->
def newLine = line.replace("__PACKAGE", targetPackageJava)
.replace("__VERSION", appVersion)
.replace("__BUILD_TIMESTAMP", buildTimeStamp)
writer.write(newLine + "\n");
}
}
}
}
A working example is provided here. Everything would be stored in source-control except the src/main/java/your/package/BuildInfo.java file. Note the version would be stored in gradle.properties.

How to use aether to get the latest timestamped snapshot dependency?

I am using Aether to resolve dependencies, but in the case of snapshot dependencies where there are multiple timestamps for the same version, it does not honor the timestamps and just picks the first available one. How do I read the timestamps using Aether?
I am using the following block of code to resolve both snapshot and release versions:
public class GetDirectDependencies
{
public static void main( String[] args )
throws Exception
{
System.out.println( "------------------------------------------------------------" );
System.out.println( GetDirectDependencies.class.getSimpleName() );
RepositorySystem system = Booter.newRepositorySystem();
RepositorySystemSession session = Booter.newRepositorySystemSession( system );
Artifact artifact = new DefaultArtifact( "org.eclipse.aether:aether-impl:1.0.0.v20140518" );
ArtifactDescriptorRequest descriptorRequest = new ArtifactDescriptorRequest();
descriptorRequest.setArtifact( artifact );
descriptorRequest.setRepositories( Booter.newRepositories( system, session ) );
ArtifactDescriptorResult descriptorResult = system.readArtifactDescriptor( session, descriptorRequest );
for ( Dependency dependency : descriptorResult.getDependencies() )
{
System.out.println( dependency );
}
}
}
This is from their documentation https://github.com/eclipse/aether-demo/blob/master/aether-demo-snippets/src/main/java/org/eclipse/aether/examples/GetDirectDependencies.java
I achieved this by configuring the following session property
aether.artifactResolver.snapshotNormalization=false
Example:
mvn -Daether.artifactResolver.snapshotNormalization=false clean install
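When resolving programmatically (as in the snippet above) rather than via the mvn command line, the same property can presumably be set on the session before resolving; a minimal sketch, assuming Booter.newRepositorySystemSession() returns a DefaultRepositorySystemSession as in the aether-demo code:
DefaultRepositorySystemSession session = Booter.newRepositorySystemSession( system );
// keep the timestamped snapshot version instead of normalizing it back to -SNAPSHOT
session.setConfigProperty( "aether.artifactResolver.snapshotNormalization", false );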

Hibernate flush optimization using `hibernate.ejb.use_class_enhancer`

I am trying to use the Hibernate feature that enhances flush performance without making code changes. I came across the option hibernate.ejb.use_class_enhancer.
I made the following changes.
1) I set the property hibernate.ejb.use_class_enhancer to true.
Build failed with error 'Cannot apply class transformer without LoadTimeWeaver specified'
2) I added <context:load-time-weaver/> to the Spring context files.
Build failed with the following error:
Specify a custom LoadTimeWeaver or start your Java virtual machine with Spring’s agent: -javaagent:spring-agent.jar
3) I added the following javaagent argument to the maven-surefire-plugin configuration:
-javaagent:${settings.localRepository}/org/springframework/spring-agent/2.5.6.SEC03/spring-agent-2.5.6.SEC03.jar
the build is successful now.
We have an interceptor that tracks the number of entities being flushed in a transaction.
After I made the above changes, I was expecting that number to come down significantly, but it did not.
My questions are:
Are the above changes correct/enough for getting the 'entity flush optimization'?
How can I verify that the application is indeed using the optimization?
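For the second question, the only check I could think of so far is to inspect one of the loaded DO classes at runtime and see whether the transformer added anything to it (just a debugging sketch; com.x.y.abcDO stands for one of our entity classes):
Class<?> doClass = Class.forName( "com.x.y.abcDO" );
// an instrumented class should list extra Hibernate-added interfaces here,
// while an untouched class only lists the interfaces declared in its source
for ( Class<?> iface : doClass.getInterfaces() )
{
    System.out.println( iface.getName() );
}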
Edit:
After debugging, I found the following.
There is a point when our DO class is submitted for transformation, but the logic that figures out whether a given class is supposed to be transformed is not handling the class names correctly (in my case); because of that, the DO class goes through without being transformed.
Is there a way I can pass my own logic instead?
The relevant code is below.
The return copyEntities.contains( className ); comes out false for the following inputs:
copyEntities contains the strings "com.x.y.abcDO", "com.x.y.asxDO", whereas the className is "com.x.y.abcDO_$$_jvsteb8_48"
public InterceptFieldClassFileTransformer(List<String> entities) {
final List<String> copyEntities = new ArrayList<String>( entities.size() );
copyEntities.addAll( entities );
classTransformer = Environment.getBytecodeProvider().getTransformer(
//TODO change it to a static class to make it faster?
new ClassFilter() {
public boolean shouldInstrumentClass(String className) {
return copyEntities.contains( className );
}
},
//TODO change it to a static class to make it faster?
new FieldFilter() {
@Override
public boolean shouldInstrumentField(String className, String fieldName) {
return true;
}
@Override
public boolean shouldTransformFieldAccess(
String transformingClassName, String fieldOwnerClassName, String fieldName
) {
return true;
}
}
);
}
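The kind of logic I would like to plug in instead would tolerate the generated-subclass names by stripping the suffix before the lookup, roughly like this (untested sketch):
new ClassFilter() {
    public boolean shouldInstrumentClass(String className) {
        // com.x.y.abcDO_$$_jvsteb8_48 -> com.x.y.abcDO
        int marker = className.indexOf( "_$$_" );
        String baseName = marker > 0 ? className.substring( 0, marker ) : className;
        return copyEntities.contains( baseName );
    }
}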
Edited on June 15th:
I updated my project to use Spring 4.0.5.RELEASE and Hibernate 4.3.5.Final.
I started using org.hibernate.jpa.HibernatePersistenceProvider
and
org.springframework.instrument.classloading.InstrumentationLoadTimeWeaver
and
hibernate.ejb.use_class_enhancer=true
With these changes, I am debugging the flush behavior. I have a question about this code block.
private boolean isUnequivocallyNonDirty(Object entity) {
if(entity instanceof SelfDirtinessTracker)
return ((SelfDirtinessTracker) entity).$$_hibernate_hasDirtyAttributes();
final CustomEntityDirtinessStrategy customEntityDirtinessStrategy =
persistenceContext.getSession().getFactory().getCustomEntityDirtinessStrategy();
if ( customEntityDirtinessStrategy.canDirtyCheck( entity, getPersister(), (Session) persistenceContext.getSession() ) ) {
return ! customEntityDirtinessStrategy.isDirty( entity, getPersister(), (Session) persistenceContext.getSession() );
}
if ( getPersister().hasMutableProperties() ) {
return false;
}
if ( getPersister().getInstrumentationMetadata().isInstrumented() ) {
// the entity must be instrumented (otherwise we cant check dirty flag) and the dirty flag is false
return ! getPersister().getInstrumentationMetadata().extractInterceptor( entity ).isDirty();
}
return false;
}
In my case, the flow returns false because the persister says yes for hasMutableProperties. I think the interceptor did not get a chance to answer at all.
Shouldn't the bytecode transformer install an interceptor here? Or should the bytecode transformation make the entity a SelfDirtinessTracker?
Can anyone explain what behavior I should expect from the bytecode transformation here?

How to update the <latest> tag in maven-metadata-local when installing an artifact?

In the project I am working on we have various multi-module projects being developed in parallel, some of which are dependent on others. Because of this we are using version ranges, e.g. [0.0.1,), for our internal dependencies during development so that we can always work against the latest snapshot versions. (I understand that this isn't considered best practice, but for now at least we are stuck with the current project structure.) We have build profiles set up so that when we perform a release all the version ranges get replaced with RELEASE to compile against the latest released version.
We have to use ranges as opposed to LATEST because when installing an artifact locally, the <latest> tag inside maven-metadata-local.xml is never updated, and so specifying LATEST will get the last version deployed to our Artifactory server. The problem with the ranges though is that the build process seems to have to download all the metadata files for all the versions of an artifact to be able to determine the latest version. As our project goes on we are accumulating more and more versions and artifacts so our builds are taking longer and longer. Specifying LATEST avoids this but means that changes from local artifact installs are generally not picked up.
Is there any way to get the <latest> tag in the maven-metadata-local.xml file to be updated when installing an artifact locally?
If you are working with SNAPSHOTs you don't need version ranges. Apart from that, never use version ranges (only in extremely rare situations). With version ranges your build is not reproducible, which in my opinion should be avoided under any circumstance.
But you can use things like this:
<version>[1.2.3,)</version>
But as you already realized, that causes some problems, so I would suggest using the versions-maven-plugin as an alternative to update the project's POM files accordingly:
mvn clean versions:use-latest-versions scm:checkin deploy -Dmessage="update versions" -DperformRelease=true
This can be handled by a CI solution like Jenkins. But I get the impression that you are doing some basic things wrong, in particular if you need to use version ranges.
I had the same problem, so I wrote a Maven plugin to handle it for me. It's a pretty extreme workaround, but it does work.
The documentation for creating Maven plugins is on the Apache Maven Project site. You could just create a plugin project from the command-line archetype and add this mojo to your project.
/**
* Inserts a "latest" block into the maven-metadata-local.xml in the user's local
* repository using the currently configured version number.
*
* @version Sep 23, 2013
*/
@Mojo( name = "latest-version", defaultPhase = LifecyclePhase.INSTALL )
public class InstallLatestVersionMojo extends AbstractMojo {
/**
* Location of the .m2 directory
*/
@Parameter( defaultValue = "/${user.home}/.m2/repository", property = "outputDir", required = true )
private File repositoryLocation;
@Parameter( defaultValue = "${project.groupId}", property = "groupId", required = true )
private String groupId;
@Parameter( defaultValue = "${project.artifactId}", property = "artifactId", required = true )
private String artifactId;
/**
* Version to use as the installed version
*/
@Parameter( defaultValue = "${project.version}", property = "version", required = true )
private String version;
public void execute() throws MojoExecutionException, MojoFailureException {
try {
// Fetch the xml file to edit from the user's repository for the project
File installDirectory = getInstallDirectory(repositoryLocation, groupId, artifactId);
File xmlFile = new File(installDirectory, "maven-metadata-local.xml");
Document xml = getXmlDoc(xmlFile);
if (xml != null) {
// Fetch the <latest> node
Node nodeLatest = getNode(xml, "/metadata/versioning/latest");
if (nodeLatest == null) {
// If <latest> does not yet exist, insert it into the <versioning> block before <versions>
nodeLatest = xml.createElement("latest");
Node versioningNode = getNode(xml, "/metadata/versioning");
if (versioningNode != null) {
versioningNode.insertBefore(nodeLatest, getNode(xml, "metadata/versioning/versions"));
}
}
// set the version on the <latest> node to the newly installed version
nodeLatest.setTextContent(version);
// save the xml
save(xmlFile, xml);
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
private void save(File xmlFile, Document xml) throws TransformerFactoryConfigurationError, TransformerException {
Transformer transformer = TransformerFactory.newInstance().newTransformer();
transformer.setOutputProperty(OutputKeys.INDENT, "yes");
Result output = new StreamResult(xmlFile);
Source input = new DOMSource(xml);
transformer.transform(input, output);
}
private Node getNode(Document source, String path) throws XPathExpressionException{
Node ret = null;
XPathExpression xPath = getPath(path);
NodeList nodes = (NodeList) xPath.evaluate(source, XPathConstants.NODESET);
if(nodes.getLength() > 0 ) {
ret = nodes.item(0);
}
return ret;
}
private XPathExpression getPath(String path) throws XPathExpressionException{
XPath xpath = XPathFactory.newInstance().newXPath();
return xpath.compile(path);
}
private File getInstallDirectory(File repositoryLocation, String groupId, String artifactId) {
String group = groupId.replace('.', '/');
return new File(repositoryLocation, group + "/" + artifactId);
}
private Document getXmlDoc(File xmlFile) throws ParserConfigurationException, SAXException, IOException {
DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
DocumentBuilder dBuilder = dbFactory.newDocumentBuilder();
return dBuilder.parse(xmlFile);
}
}
How about defining those internal dependencies as modules in one reactor pom? That way you'll compile against the compiled sources (in target/classes) instead of against a jar, and you'll always have the latest code.

How to programmatically list all transitive dependencies, including overridden ones in Maven using DependencyGraphBuilder?

This is similar to other questions (like this one), but I want to be able to do it with the latest APIs. The maven-dependency-plugin:tree verbose option has been deprecated and does nothing in the latest (2.5.1) code, so there is no good example of how to do it.
I believe the Aether utility class from jcabi-aether can help you get a list of all dependencies of any Maven artifact, for example:
File repo = this.session.getLocalRepository().getBasedir();
Collection<Artifact> deps = new Aether(this.getProject(), repo).resolve(
new DefaultArtifact("junit", "junit-dep", "", "jar", "4.10"),
JavaScopes.RUNTIME
);
If you're outside of Maven plugin:
File repo = new File("/tmp/local-repository");
MavenProject project = new MavenProject();
project.setRemoteProjectRepositories(
Arrays.asList(
new RemoteRepository(
"maven-central",
"default",
"http://repo1.maven.org/maven2/"
)
)
);
Collection<Artifact> deps = new Aether(project, repo).resolve(
new DefaultArtifact("junit", "junit-dep", "", "jar", "4.10"),
"runtime"
);
The only dependency you need is:
<dependency>
<groupId>com.jcabi</groupId>
<artifactId>jcabi-aether</artifactId>
<version>0.7.5</version>
</dependency>
Including my approach here, as the additional steps may become part of your actual use case, esp. if working on a composite or multi-module project.
(Maven 3, my runtime was 3.6; no direct dependency on Aether)
In my case I wanted to resolve the dependency tree of a specific artifact foo-runtime from inside my plugin; however,
some of the dependency versions were only available in its parent foo-parent POM (i.e. absent in foo-runtime's own POM).
The parent POM also had additional details, such as exclusions for some of foo-runtime's dependencies, via dependencyManagement.
So I had to:
explicitly load the parent's model,
link the child model to it,
fill in the missing version numbers of the child model (still not sure why Maven didn't automatically resolve these after linking the parent), and then
run dependency resolution for the child.
To avoid building a model from scratch, I derived the model of foo-runtime using an existing artifact foo-api (which in my case is always guaranteed to be present in the Maven project being built). All these artifacts share the same groupId.
@Component
public LifecycleDependencyResolver resolver;
// ...
// `artifacts` contains all artifacts of current/reactor `MavenProject` obtained via `project.getArtifacts()`
private Set<Artifact> resolveRuntimeDeps(Set<Artifact> artifacts) throws MojoExecutionException {
// foo-api will always be present; use it to derive coordinates for foo-runtime
Artifact fooApi = artifacts.stream().filter(artifact -> "foo-api".equals(artifact.getArtifactId()))
.findFirst().orElseThrow(() -> new MojoExecutionException("Unable to find foo-api"));
Collection<String> scopes = Arrays.asList("compile", "runtime");
MavenProject fooRoot = deriveProject(fooApi, "foo-parent");
Model fooRootPom = fooRoot.getModel();
MavenProject fooSrv = deriveProject(fooApi, "foo-runtime");
fooSrv.setParent(fooRoot);
// some foo-runtime deps depend on versions declared on parent pom; merge them
Map<String, Artifact> depMgt = fooRootPom.getDependencyManagement().getDependencies().stream()
.collect(Collectors.toMap(dep -> dep.getGroupId() + ":" + dep.getArtifactId() + ":" + dep.getType(), this::toArtifact));
for (Dependency d : fooSrv.getDependencies()) {
if (d.getVersion() == null) {
Artifact managed = depMgt.get(d.getGroupId() + ":" + d.getArtifactId() + ":" + d.getType());
if (managed != null) {
d.setVersion(managed.getVersion());
}
}
}
try {
resolver.resolveProjectDependencies(fooSrv, scopes, scopes, session, false, Collections.emptySet());
return fooSrv.getArtifacts();
} catch (LifecycleExecutionException e) {
throw new MojoExecutionException("Error resolving foo-runtime dependencies", e);
}
}
// load POM for another artifact based on foo-api JAR available in current project
private MavenProject deriveProject(Artifact fooApi, String artifactId) throws MojoExecutionException {
Model pom;
String pomPath = fooApi.getFile().getAbsolutePath().replaceAll("foo-api", artifactId).replaceAll("\\.jar$", ".pom");
try (InputStream fooRootPomData = new FileInputStream(pomPath)) {
pom = new MavenXpp3Reader().read(fooRootPomData);
pom.setPomFile(new File(pomPath));
} catch (IOException | XmlPullParserException e) {
throw new MojoExecutionException("Error loading " + artifactId + " metadata", e);
}
// set these params to avoid skips/errors during resolution
MavenProject proj = new MavenProject(pom);
proj.setArtifact(toArtifact(pom));
proj.setArtifactFilter(Objects::nonNull);
proj.setRemoteArtifactRepositories(Collections.emptyList());
return proj;
}
private Artifact toArtifact(Model model) {
return new DefaultArtifact(
Optional.ofNullable(model.getGroupId()).orElseGet(() -> model.getParent().getGroupId()), model.getArtifactId(),
Optional.ofNullable(model.getVersion()).orElseGet(() -> model.getParent().getVersion()), "compile", model.getPackaging(), null,
project.getArtifact().getArtifactHandler());
}
private Artifact toArtifact(Dependency dep) {
return new DefaultArtifact(dep.getGroupId(), dep.getArtifactId(), dep.getVersion(), dep.getScope(), dep.getType(), dep.getClassifier(),
project.getArtifact().getArtifactHandler());
}
(I tried almost all the other suggested approaches, however all of them ended up with some error or another. Now, looking back, I suspect many of those errors might have been due to the fact that my leaf POM was missing version numbers for some artifacts. It seems (acceptably so) the "model enrichment" phase - propagating parent versions etc. - is carried out by some earlier component in Maven's flow; and the caller has to take care of this, at least partially, when invoking the dependency resolver from scratch.)
