ANDing multiple file conditions does not work in a Maven profile - maven

I'm using Maven 3.5+, and I've read that Maven 3.2.2+ supports AND conditions in profile activation. So I've added multiple conditions to the activation tag of a profile, as below:
<activation>
<file>
<exists>${basedir}/src/main/resources/static/index.html</exists>
<missing>${basedir}/src/main/resources/static/app/gen-src/metadata.json</missing>
</file>
</activation>
I put it in the parent's pom.xml, and the profile should execute when the child project contains index.html but does not have metadata.json.
When I compile a child project that has both index.html and metadata.json, the profile is activated and its plugins execute, but the profile should not be active in this situation. I think the conditions are ORed by Maven.

Looking at the v3.5.0 ActivationFile javadoc (couldn't find the source yet) and FileProfileActivator sources, currently that does not seem to be possible with multiple files, and there's this issue open.
The file activation configuration accepts two parameters, one for an existing file and one for a missing file. Both parameters belong to the same configuration, and you can only have one such configuration.
As a result, Maven checks for either an existing or a missing file (in that order of precedence if both values are set), but never for both. Unfortunately I couldn't find a work-around so far...
1) ActivationFile javadoc:
public class ActivationFile
extends Object
implements Serializable, Cloneable, InputLocationTracker
This is the file specification used to activate the profile. The missing value is the location of a file that needs to exist, and if it doesn't, the profile will be activated. On the other hand, exists will test for the existence of the file and if it is there, the profile will be activated.
Variable interpolation for these file specifications is limited to ${basedir}, System properties and request properties.
2) FileProfileActivator sources (please note that I've omitted some interpolation code for the sake of brevity)
@Override
public boolean isActive(Profile profile, ProfileActivationContext context, ModelProblemCollector problems) {
Activation activation = profile.getActivation();
if (activation == null) {
return false;
}
ActivationFile file = activation.getFile();
if (file == null) {
return false;
}
String path;
boolean missing;
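// note: if both <exists> and <missing> are configured, <exists> takes precedence and <missing> is never checked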
if (StringUtils.isNotEmpty(file.getExists())) {
path = file.getExists();
missing = false;
} else if (StringUtils.isNotEmpty(file.getMissing())) {
path = file.getMissing();
missing = true;
} else {
return false;
}
/* ===> interpolation code omitted for the sake of brevity <=== */
// replace activation value with interpolated value
if (missing) {
file.setMissing(path);
} else {
file.setExists(path);
}
File f = new File(path);
if (!f.isAbsolute()) {
return false;
}
boolean fileExists = f.exists();
return missing ? !fileExists : fileExists;
}

Related

Specifying rootPath for Liquibase via jOOQ generation

I'm trying to utilize jOOQ's ability to generate from Liquibase files. My file structure is as follows:
C:
- dev
-- testproject
--- src/main/resources
---- db
----- changelog.xml
In order to reference this file from the jOOQ configuration, I have the following in my build.gradle.kts:
jooq {
configurations {
create("main") {
jooqConfiguration.apply {
generator.apply {
database.apply {
name = "org.jooq.meta.extensions.liquibase.LiquibaseDatabase"
properties.add(Property().apply {
key = "rootPath"
value = "C:/dev/testproject/src/main/resources/db/"
})
properties.add(Property().apply {
key = "scripts"
value = "changelog.xml"
})
}
}
}
}
}
}
I'm using plugin version 7.1.1 and have the following dependencies:
dependencies {
implementation("org.liquibase:liquibase-core:4.8.0") // I tried removing this, no change
jooqGenerator("org.postgresql:postgresql:42.3.2")
jooqGenerator("org.jooq:jooq-meta-extensions-liquibase:3.17.2")
jooqGenerator(files("src/main/resources")) // I don't think this is necessary
}
When I try to run generateJooq, the error I get is:
Caused by: liquibase.exception.ChangeLogParseException: The file changelog.xml was not found in
Specifying files by absolute path was removed in Liquibase 4.0. Please use a relative path or add '/' to the classpath parameter.
at liquibase.parser.core.xml.XMLChangeLogSAXParser.parseToNode(XMLChangeLogSAXParser.java:82)
at liquibase.parser.core.xml.AbstractChangeLogParser.parse(AbstractChangeLogParser.java:15)
at liquibase.Liquibase.getDatabaseChangeLog(Liquibase.java:369)
Notice how it doesn't say which directories it looked in. As far as I can tell, the resource accessor is not receiving the rootPath from the configuration. The relevant output from Liquibase is here. Again, it should say it looked in the rootPath, but it doesn't print anything else, so there must be no directories searched.
Not sure if this is helpful, but the jOOQ configuration file in build/tmp/generateJooq definitely has the rootPath:
<property>
<key>rootPath</key>
<value>C:/dev/testproject/src/main/resources/db/</value>
</property>
I'm not sure where I'm going wrong. I've also tried the following values of scripts without setting rootPath and seen the same behavior:
C:/dev/testproject/src/main/resources/db/changelog.xml
src/main/resources/db/changelog.xml
/src/main/resources/db/changelog.xml
classpath:src/main/resources/db/changelog.xml
classpath:/src/main/resources/db/changelog.xml
This was causing the problem (or rather, the confusion):
jooqGenerator(files("src/main/resources"))
Apparently, this sets the classpath of the jooqGenerator task to be src/main/resources! So, knowing that, I fixed my configuration to look like this:
database.apply {
name = "org.jooq.meta.extensions.liquibase.LiquibaseDatabase"
properties.add(Property().apply {
key = "scripts"
value = "classpath:db/changelog.xml"
})
}
Everything is working nicely now.

How to update the <latest> tag in maven-metadata-local when installing an artifact?

In the project I am working on we have various multi-module projects being developed in parallel, some of which are dependent on others. Because of this we are using version ranges, e.g. [0.0.1,), for our internal dependencies during development so that we can always work against the latest snapshot versions. (I understand that this isn't considered best practice, but for now at least we are stuck with the current project structure.) We have build profiles set up so that when we perform a release, all the version ranges get replaced with RELEASE to compile against the latest released version.
We have to use ranges as opposed to LATEST because when installing an artifact locally, the <latest> tag inside maven-metadata-local.xml is never updated, and so specifying LATEST will get the last version deployed to our Artifactory server. The problem with the ranges though is that the build process seems to have to download all the metadata files for all the versions of an artifact to be able to determine the latest version. As our project goes on we are accumulating more and more versions and artifacts so our builds are taking longer and longer. Specifying LATEST avoids this but means that changes from local artifact installs are generally not picked up.
Is there any way to get the <latest> tag in the maven-metadata-local.xml file to be updated when installing an artifact locally?
If you are working with SNAPSHOTs you don't need version ranges. Apart from that, never use version ranges (except in extremely rare situations). With version ranges your build is not reproducible, which in my opinion should be avoided under any circumstances.
But you can use things like this:
<version>[1.2.3,)</version>
but as you have already realized, that causes some problems, so I would suggest using the versions-maven-plugin as an alternative to update the project's pom files accordingly.
mvn clean versions:use-latest-versions scm:checkin deploy -Dmessage="update versions" -DperformRelease=true
This can be handled by a CI solution like Jenkins. But I get the impression that you are doing some basic things wrong, in particular if you need to use version ranges.
I had the same problem, so I wrote a maven plugin to handle it for me. It's a pretty extreme workaround, but it does work.
The documentation for creating maven plugins is on The Apache Maven Project. You could just create a plugin project from the command line archetype and add this mojo to your project.
/**
* Inserts a "latest" block into the maven-metadata-local.xml in the user's local
* repository using the currently configured version number.
*
* @version Sep 23, 2013
*/
@Mojo( name = "latest-version", defaultPhase = LifecyclePhase.INSTALL )
public class InstallLatestVersionMojo extends AbstractMojo {
/**
* Location of the .m2 directory
*/
@Parameter( defaultValue = "/${user.home}/.m2/repository", property = "outputDir", required = true )
private File repositoryLocation;
@Parameter( defaultValue = "${project.groupId}", property = "groupId", required = true )
private String groupId;
@Parameter( defaultValue = "${project.artifactId}", property = "artifactId", required = true )
private String artifactId;
/**
* Version to use as the installed version
*/
@Parameter( defaultValue = "${project.version}", property = "version", required = true )
private String version;
public void execute() throws MojoExecutionException, MojoFailureException {
try {
// Fetch the xml file to edit from the user's repository for the project
File installDirectory = getInstallDirectory(repositoryLocation, groupId, artifactId);
File xmlFile = new File(installDirectory, "maven-metadata-local.xml");
Document xml = getXmlDoc(xmlFile);
if (xml != null) {
// Fetch the <latest> node
Node nodeLatest = getNode(xml, "/metadata/versioning/latest");
if (nodeLatest == null) {
// If <latest> does not yet exist, insert it into the <versioning> block before <versions>
nodeLatest = xml.createElement("latest");
Node versioningNode = getNode(xml, "/metadata/versioning");
if (versioningNode != null) {
versioningNode.insertBefore(nodeLatest, getNode(xml, "metadata/versioning/versions"));
}
}
// set the version on the <latest> node to the newly installed version
nodeLatest.setTextContent(version);
// save the xml
save(xmlFile, xml);
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
private void save(File xmlFile, Document xml) throws TransformerFactoryConfigurationError, TransformerException {
Transformer transformer = TransformerFactory.newInstance().newTransformer();
transformer.setOutputProperty(OutputKeys.INDENT, "yes");
Result output = new StreamResult(xmlFile);
Source input = new DOMSource(xml);
transformer.transform(input, output);
}
private Node getNode(Document source, String path) throws XPathExpressionException{
Node ret = null;
XPathExpression xPath = getPath(path);
NodeList nodes = (NodeList) xPath.evaluate(source, XPathConstants.NODESET);
if(nodes.getLength() > 0 ) {
ret = nodes.item(0);
}
return ret;
}
private XPathExpression getPath(String path) throws XPathExpressionException{
XPath xpath = XPathFactory.newInstance().newXPath();
return xpath.compile(path);
}
private File getInstallDirectory(File repositoryLocation, String groupId, String artifactId) {
String group = groupId.replace('.', '/');
return new File(repositoryLocation, group + "/" + artifactId);
}
private Document getXmlDoc(File xmlFile) throws ParserConfigurationException, SAXException, IOException {
DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
DocumentBuilder dBuilder = dbFactory.newDocumentBuilder();
return dBuilder.parse(xmlFile);
}
}
How about defining those internal dependencies as modules in one reactor pom? That way you'll compile against the compiled sources (in target/classes) instead of against a jar, and you'll always have the latest code.

How can I provide custom logic in a Maven archetype?

I'm interested in creating a Maven archetype, and I think I have most of the basics down. However, one thing I'm stuck on is that sometimes I want to use custom logic to fill in a template. For example, if somebody generates my archetype and specifies the artifactId as hello-world, I'd like to generate a class named HelloWorld that simply prints out "Hello World!" to the console. If another person generates it with artifactId = howdy-there, the genned class would be HowdyThere and it would print out "Howdy There!".
I know that under the covers, Maven's archetype mechanism leverages the Velocity Template Engine, so I read this article on creating custom directives. This seemed to be what I was looking for, so I created a class called HyphenatedToCamelCaseDirective that extends org.apache.velocity.runtime.directive.Directive. In that class, my getName() implementation returns "hyphenatedCamelCase". In my archetype-metadata.xml file, I have the following...
<requiredProperties>
<requiredProperty key="userdirective">
<defaultValue>com.jlarge.HyphenatedToCamelCaseDirective</defaultValue>
</requiredProperty>
</requiredProperties>
My template class looks like this...
package ${package};
public class #hyphenatedToCamelCase('$artifactId') {
// userdirective = $userdirective
public static void main(String[] args) {
System.out.println("#hyphenatedToCamelCase('$artifactId')");
}
}
After I install my archetype and then do an archetype:generate by specifying artifactId = howdy-there and groupId = f1.f2, the resulting class looks like this...
package f1.f2;
public class #hyphenatedToCamelCase('howdy-there') {
// userdirective = com.jlarge.HyphenatedToCamelCaseDirective
public static void main(String[] args) {
System.out.println("#hyphenatedToCamelCase('howdy-there')");
}
}
The result shows that even though userdirective is being set the way I expected it to, it's not evaluating the #hyphenatedToCamelCase directives like I was hoping. In the directive class, I have the render method logging a message to System.out, but that message doesn't show up in the console, so that leads me to believe the method never got executed during archetype:generate.
Am I missing something simple here, or is this approach just not the way to go?
The required properties section of the archetype-metadata.xml is used to pass additional properties to the Velocity context; it is not meant for passing Velocity engine configuration. So setting a property called userdirective will only make the variable $userdirective available and will not add a custom directive to the Velocity engine.
If you look at the source code, the Velocity engine used by the maven-archetype plugin does not depend on any external property source for its configuration. The code that generates the project relies on an implementation of VelocityComponent autowired by the Plexus container.
This is the code where the velocity engine is initialized:
public void initialize()
throws InitializationException
{
engine = new VelocityEngine();
// avoid "unable to find resource 'VM_global_library.vm' in any resource loader."
engine.setProperty( "velocimacro.library", "" );
engine.setProperty( RuntimeConstants.RUNTIME_LOG_LOGSYSTEM, this );
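// any further engine properties come from plexus-velocity's components.xml (see below)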
if ( properties != null )
{
for ( Enumeration e = properties.propertyNames(); e.hasMoreElements(); )
{
String key = e.nextElement().toString();
String value = properties.getProperty( key );
engine.setProperty( key, value );
getLogger().debug( "Setting property: " + key + " => '" + value + "'." );
}
}
try
{
engine.init();
}
catch ( Exception e )
{
throw new InitializationException( "Cannot start the velocity engine: ", e );
}
}
There is a hacky way of adding your custom directive. The properties you see above are read from the components.xml file in the plexus-velocity-1.1.8.jar. So open this file and add your configuration property
<component-set>
<components>
<component>
<role>org.codehaus.plexus.velocity.VelocityComponent</role>
<role-hint>default</role-hint>
<implementation>org.codehaus.plexus.velocity.DefaultVelocityComponent</implementation>
<configuration>
<properties>
<property>
<name>resource.loader</name>
<value>classpath,site</value>
</property>
...
<property>
<name>userdirective</name>
<value>com.jlarge.HyphenatedToCamelCaseDirective</value>
</property>
</properties>
</configuration>
</component>
</components>
</component-set>
Next add your custom directive class file to this jar and run archetype:generate.
As you can see, this is very fragile and you will need to figure out a way to distribute this hacked plexus-velocity jar. Depending on what you are planning to use this archetype for, it might be worth the effort.
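For reference, here is a minimal sketch of what a directive along these lines could look like. This is not the asker's actual class; the directive name, argument handling and camel-casing logic are assumptions based on the template usage shown in the question:
import org.apache.velocity.context.InternalContextAdapter;
import org.apache.velocity.exception.MethodInvocationException;
import org.apache.velocity.exception.ParseErrorException;
import org.apache.velocity.exception.ResourceNotFoundException;
import org.apache.velocity.runtime.directive.Directive;
import org.apache.velocity.runtime.parser.node.Node;
import java.io.IOException;
import java.io.Writer;
public class HyphenatedToCamelCaseDirective extends Directive {
    @Override
    public String getName() {
        // must match the name used in the template: #hyphenatedToCamelCase(...)
        return "hyphenatedToCamelCase";
    }
    @Override
    public int getType() {
        // a single-line directive, i.e. no #end block
        return LINE;
    }
    @Override
    public boolean render(InternalContextAdapter context, Writer writer, Node node)
            throws IOException, ResourceNotFoundException, ParseErrorException, MethodInvocationException {
        // evaluate the single argument, e.g. 'howdy-there'
        String value = String.valueOf(node.jjtGetChild(0).value(context));
        StringBuilder camel = new StringBuilder();
        for (String part : value.split("-")) {
            if (!part.isEmpty()) {
                camel.append(Character.toUpperCase(part.charAt(0))).append(part.substring(1));
            }
        }
        writer.write(camel.toString());
        return true;
    }
}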

Environment-specific web.xml in grails?

What's the best way to build environment-specific web.xml entries in grails?
I need to make certain modifications for production only, as they break running locally.
Any thoughts?
You can create scripts/_Events.groovy with an event handler for the 'WebXmlEnd' event which is fired once Grails and the plugins have finished making their changes. Update the XML with plain search/replace or via DOM methods by parsing the XML and write out the updated file:
import grails.util.Environment
eventWebXmlEnd = { String filename ->
if (Environment.current != Environment.PRODUCTION) {
return
}
String content = webXmlFile.text
// update the XML
content = ...
webXmlFile.withWriter { file -> file << content }
}
Here's the solution that I'm using, from the guy over at death-head.ch.
First, install the templates:
grails install-templates
Then customize the web.xml you'll find in src/templates/war/web.xml. I chose to make a web_dev.xml and a web_prod.xml and delete the web.xml; I wanted web_prod.xml to contain a security-constraint block. Anyway...
Place the following in BuildConfig.groovy:
// #########################################################
// ## Can't use environment switching block because BuildConfig doesn't support it.
// ## #url http://jira.grails.org/browse/GRAILS-4260
// ## So use this workaround:
// ## #url http://death-head.ch/blog/2010/09/finally-solved-the-base-authentication-in-grails/
// #########################################################
switch ("${System.getProperty('grails.env')}") {
case "development":
if (new File("/${basedir}/src/templates/war/web_dev.xml").exists()) {
grails.config.base.webXml = "file:${basedir}/src/templates/war/web_dev.xml"
}
break;
default:
if (new File("/${basedir}/src/templates/war/web_prod.xml").exists()) {
grails.config.base.webXml = "file:${basedir}/src/templates/war/web_prod.xml"
}
break;
}
Good luck!
I've never tried it, but it should be possible to specify the grails.config.base.webXml parameter in BuildConfig.groovy dependent on the current environment.
There's a list of available BuildConfig settings here
EDIT
Actually, due to this issue, this isn't a way forward :-( Maybe passing the property like:
grails -Dgrails.config.base.webXml=/path/to/web.xml
is all that's possible?

How to discover types exported by an OSGi bundle without installing/activating?

Basically I want to discover if a jar implements any number of interfaces without activating or starting the bundle. Is it possible to read the metadata from the META-INF via an API, just like the container does, but without activating the bundle?
I want to use OSGi to support plugins, for which numerous interfaces will be published, and I would like to know which interfaces are implemented by a bundle when the user uploads it, without activating the bundle, etc.
I do not think it is possible to discover what services a bundle is going to provide, because this can happen from inside the Java code, without any meta-data about it. Of course, if you use Declarative Services, there is a meta-data file. Also, the bundle needs to import (or provide) the service interface, which may give you a hint (but not more).
You can inspect what Java packages a bundle imports and exports without activating it.
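As a simpler sketch of the "without installing" case: the Export-Package and Import-Package headers can be read straight from the jar's manifest with plain java.util.jar classes. The class name and argument handling below are made up for illustration, and the raw header values are printed without proper parsing (version ranges contain commas inside quotes, so a naive split is not enough):
import java.io.File;
import java.io.IOException;
import java.util.jar.JarFile;
import java.util.jar.Manifest;
public class BundleHeaderInspector {
    public static void main(String[] args) throws IOException {
        // path to the uploaded bundle jar (hypothetical argument)
        File bundleJar = new File(args[0]);
        try (JarFile jar = new JarFile(bundleJar)) {
            Manifest manifest = jar.getManifest();
            if (manifest == null) {
                System.out.println("No manifest present, not an OSGi bundle");
                return;
            }
            // OSGi metadata lives in the main manifest attributes
            String exports = manifest.getMainAttributes().getValue("Export-Package");
            String imports = manifest.getMainAttributes().getValue("Import-Package");
            System.out.println("Export-Package: " + exports);
            System.out.println("Import-Package: " + imports);
        }
    }
}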
If you are willing to install (not resolve, not activate) it, you can query it. The Felix or Equinox shells can list those packages after all.
Here is the relevant source from Felix' shell. It uses the PackageAdmin service:
public void execute(String s, PrintStream out, PrintStream err)
{
// Get package admin service.
ServiceReference ref = m_context.getServiceReference(
org.osgi.service.packageadmin.PackageAdmin.class.getName());
PackageAdmin pa = (ref == null) ? null :
(PackageAdmin) m_context.getService(ref);
// ...
Bundle bundle = m_context.getBundle( bundleId );
ExportedPackage[] exports = pa.getExportedPackages(bundle);
// ...
}
You may try something like below: find the ".class" files in the exported packages using the bundle.findEntries(...) method.
BundleContext context = bundle.getBundleContext();
ServiceReference ref = context.getServiceReference(PackageAdmin.class.getName());
PackageAdmin packageAdmin = (PackageAdmin)context.getService(ref);
List<Class> agentClasses = new ArrayList<Class>();
ExportedPackage[] exportedPackages = packageAdmin.getExportedPackages(bundle);
for(ExportedPackage ePackage : exportedPackages){
String packageName = ePackage.getName();
String packagePath = "/"+packageName.replace('.', '/');
//find all the class files in current exported package
Enumeration clazzes = bundle.findEntries(packagePath, "*.class", false);
while(clazzes.hasMoreElements()){
URL url = (URL)clazzes.nextElement();
String path = url.getPath();
int index = path.lastIndexOf("/");
int endIndex = path.length()-6;//Strip ".class" substring
String className = path.substring(index+1, endIndex);
String fullClassName=packageName+"."+className;
try {
Class clazz = bundle.loadClass(fullClassName);
//check whether the class is annotated with Agent tag.
if(clazz.isAnnotationPresent(Agent.class))
agentClasses.add(clazz);
} catch (ClassNotFoundException e) {
e.printStackTrace();
}
}
}
