How to configure Typesafe Activator *a priori* to use an existing local Maven repository?

(Not found in the Activator documentation)
It seems that it is possible to have Activator also use an existing local Maven repository by adding the following entry in the build.sbt file:
resolvers ++= Seq(
"Local Maven Repository" at "file://q:/repositories/maven",
"Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
)
I am not sure it works, but anyway, the problem with this approach is that the project structure must already have been created (and therefore a local repository created and automatically populated by downloads). Hence my question: is it possible to tell Activator, before it creates the project structure, that it should use some local Maven repository?
Thanks in advance for any hint.

Activator makes use of the sbt launcher. You can use the launcher to control which repositories sbt uses by default, both for each project and for the launcher itself.
If you'd like to modify the activator launcher itself, unzip the jar file and take a look at the sbt/boot.properties file included. You can use the format outlined at sbt's launcher docs to add your local maven repository to the list.
A simpler option in the future (but not enabled in our current properties file) is the launcher's ability to have an override repository configuration file. See: Sbt's proxy configuration docs. This file would allow you to specify the repositories you wish activator to use by default. We disabled this to ensure the offline repository which activator uses is added by default. However, I'll open a ticket to re-enable this feature. That way, you should be able to just create a ~/.sbt/repositories file with the following contents:
[repositories]
activator-local: file://${activator.local.repository-${activator.home-${user.home}/.activator}/repository}, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
local
maven-local
maven-central
typesafe-releases: http://typesafe.artifactoryonline.com/typesafe/releases
typesafe-ivy-releases: http://typesafe.artifactoryonline.com/typesafe/ivy-releases, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
Note: the ~/.sbt/boot directory will always exist. This is created to ensure that no other process deletes jar files we use while running, so we copy these out of the local cache. If we didn't, you'd see some really fun error messages.

Related

How to create multiple Maven m2 repositories for single user

I am setting up 2 Jenkins instances on the same server. I would like to create 2 local Maven repositories, one for each Jenkins instance. Jenkins 1 is already set up and operational and I wouldn't like to touch it.
Can we have 2 local Maven repositories for a single user, as both Jenkins instances run as the same user?
Is there any way I can point Maven from Jenkins to the new repository?
Thanks
AI
Thanks for your replies, I figured it out. To every Maven command I run, I add the extra parameter
-Dmaven.repo.local=/path/to/alternate/local/repository/
and it overrides the default local Maven repository.
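For example, a one-off build using the alternate repository might look like this (the path is just a placeholder):
mvn clean install -Dmaven.repo.local=/path/to/alternate/local/repository/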
Cheers,
AI
You can specify a local repository in the settings.xml under <settings><localRepository>.
Alternatively, you can also specify it on the command line as described here:
https://stackoverflow.com/a/7071791/927493
Maven configuration occurs at 3 levels:
◾ Project - most static configuration occurs in pom.xml
◾ Installation - this is configuration added once for a Maven installation
◾ User - this is configuration specific to a particular user
Maven reads the settings from the settings.xml file which can be located in ${M2_HOME}/conf/settings.xml, as well as ${user.home}/.m2/settings.xml
<settings>
...
<localRepository>/path/to/local/repo/</localRepository>
...
</settings>
The default value for the local maven repository is ${user.home}/.m2/repository/
Even though you are constrained by using the same user for both Jenkins instances, remember that the jobs run on nodes, and those can be launched by different users (if via SSH); it is that node user's ${user.home} which is read, not that of the user running Jenkins itself (see the Node note below).
What's the simplest way to get two different <localRepository> for a given user?
Jenkins has a Global Tool Configuration (${JENKINS_URL}/configureTools) for Maven.
Further down the same page you must configure the Maven installations.
You could choose Global Tool Configuration | Maven Configuration to NOT use the default Maven settings (the two mentioned above) and instead specify one from the filesystem.
On one instance, choose ${user.home}/.m2/settings.J1.xml, and on the other, choose ${user.home}/.m2/settings.J2.xml.
Alternatively, you could even choose two different "Maven installations", each with a different MAVEN_HOME, and then have a different ${M2_HOME}/conf/settings.xml in each (awkward, but sometimes useful).
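As a minimal sketch, each per-instance settings file only needs to differ in its localRepository (the paths below are placeholders):
settings.J1.xml:
<settings>
  <localRepository>/srv/jenkins1/.m2/repository</localRepository>
</settings>
settings.J2.xml:
<settings>
  <localRepository>/srv/jenkins2/.m2/repository</localRepository>
</settings>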
JOB SPECIFIC REPO?
But, if disk space is not really an issue, you could go a step further and give every single job its own private local repository. This is especially handy when building shared libraries in parallel, parallel branches and other scenarios using -SNAPSHOT. Under the Advanced ... options in the maven step, select [ X ] use private Maven repository. That repository ends up residing in ${WORKSPACE}/.repository. Suggest adding the Workspace Cleanup plugin to help manage your space.
You can also, on a per-job basis, specify a specific file-system settings.xml.
NODE SPECIFIC REPO?
Also, each Node has its own Node Properties which again let you customize the "Tool Locations". You can override the "(Maven) Home" location here (but not settings location).
PIPELINE
All this is also supported and configurable if using a Pipeline, as described in the Pipeline Maven Integration Plugin.
Extra Plugins
Finally, if you don't like the idea of having all these settings.xml files lying all over the filesystem, you can install the Config File Provider plugin, which lets you store the custom settings.xml within Jenkins. After installation, the Global Tool Configuration and the Job steps have the added option to choose from the "provided settings.xml" entries you create.

maven repositories and mirrors and command line options

I'm having an issue where an external tool makes a call that causes mvn to download a dependency on the fly. This download, however, goes to the "central" enterprise Artifactory repo rather than one of our normal Artifactory repos, and I'm trying to figure out how to mirror the enterprise repo so that it points to the appropriate repo.
All I've seen indicates I should be able to do this by setting the mirror in the settings.xml file, and I've passed the path to this settings file via the -s option.
But the mirror is still being ignored.
Is there something special about resolving a dependency from the command line that bypasses mirrors?
It appears that the reason setting mirrors wasn't working was that the deployment mechanisms in place weren't actually putting the XML files in place as intended. To get around it, we added code to the script run during deployment that modifies the .m2 folder to contain the XML files.
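For reference, the kind of mirror entry being attempted usually looks roughly like this in settings.xml (the id, mirrorOf and URL values are placeholders, not taken from the question):
<settings>
  <mirrors>
    <mirror>
      <id>enterprise-mirror</id>
      <!-- redirect anything that would go to central to the internal repo -->
      <mirrorOf>central</mirrorOf>
      <url>https://artifactory.example.com/artifactory/libs-release</url>
    </mirror>
  </mirrors>
</settings>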

Kafka Connector - Packaging jars

QUESTION
I am not a Maven pro and I got stuck trying to package a Kafka Connector. There are two options for packaging it:
Either you produce a folder with a jar that contains the connector plus all the dependency jars, minus the Kafka-specific jars,
Or you build a fat jar with all of the dependencies (and, I assume, again without the Kafka-specific jars, but this is not explicit in the docs).
I am following the docs on the Confluent webpage and the connector I am trying to package is this one on github.
What I tried, after cloning the repo with git, is the following: mvn clean package. But this seems to create only a single jar of the original project, with the dependencies left in the mvn cache (~/.m2/repository/).
Google also has links on how to create a fat jar, but I would need some way to specify which jars I want to exclude from the fat jar.
Thanks
UPDATE
Now I am running:
connect-standalone /etc/kafka/connect-standalone.properties /etc/kafka/connect-cdc-mssql-source.properties
Where /etc/kafka/connect-standalone.properties contains the following line:
plugin.path=/shared_win_files
And ls -al /shared_win_files contains the following:
kafka-connect-cdc-mssql-0.0.1-SNAPSHOT.jar
And jar tvf kafka-connect-cdc-mssql-0.0.1-SNAPSHOT.jar contains the following:
6996 Thu Sep 07 14:47:24 BST 2017 com/github/jcustenborder/kafka/connect/cdc/mssql/MsSqlSourceConnector.class
where MsSqlSourceConnector.class is basically this class here, which implements the Connector interface.
But when I try to run the connector with the command above, I get an error
Failed to find any class that implements Connector and which name matches com.github.jcustenborder.kafka.connect.cdc.mssql.MsSqlSourceConnector
It gives a massive list with all available plugins, but mine is not in there.
Currently, an easy way to package your connector with maven is to use maven-assembly-plugin. This basically entails two main steps:
Define one or more assembly descriptors and save them under src/assembly.
Doc: http://maven.apache.org/plugins/maven-assembly-plugin/descriptor-refs.html
Example: https://github.com/confluentinc/kafka-connect-elasticsearch/blob/master/src/assembly/package.xml
In the descriptor, among other things, you may choose the packaging format of your archive, the files and directories to include or exclude, as well as specific settings regarding your project's dependencies.
Include the plugin in your project's pom.xml
Example: https://github.com/confluentinc/kafka-connect-elasticsearch/blob/master/pom.xml
This mainly requires you to define the configuration and executions sections of the maven-assembly-plugin declaration. Additionally, you can associate calls to specific assembly plugin descriptors with certain Maven profiles that you may define; a rough sketch of the relevant pom.xml section follows.
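As a sketch (the descriptor path and execution id are assumptions modelled on the linked examples, not taken from the connector's actual pom.xml), the plugin section could look like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptors>
      <!-- custom descriptor deciding what is packaged and what is excluded -->
      <descriptor>src/assembly/package.xml</descriptor>
    </descriptors>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Running mvn clean package then builds the archive described by the descriptor.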
Finally, stay tuned because packaging your Kafka Connect plugins (connectors, transforms, converters) might be significantly simplified soon.
Following Konstantine's answer on how to package the jars:
The remaining problem was that when specifying plugin.path=/abc in the Kafka Connect config, you have to be careful.
You can either put a fat jar like this:
/abc/fatjar.jar
Or you have to create another folder in abc and put all the related jars into that folder like this:
/abc/my-connector-a/connector.jar
/abc/my-connector-a/connector-dependency.jar
...
Otherwise, as in my case, Kafka Connect treats each jar as a separate plugin.

bndtools Activator bundle

How can I create a simple bundle with an Activator in bndtools?
It keeps saying that:
The JAR is empty: The instructions for the JAR named com.myproj did not cause any content to be included, this is likely wrong bnd.bnd /com.myproj Unknown Bndtools Problem Marker
Unused Private-Package instructions, no such package(s) on the class path: [com.myproj] bnd.bnd /com.myproj Unknown Bndtools Problem Marker
The way I create this project in Eclipse is:
Create new "Bndtools OSGi project"
Right click, configure - Convert to Maven project
Create Activator.java in package com.myproj.
Add com.myproj to private packages
Set the activator to com.myproj.Activator
Here is my bnd file:
Bundle-Activator: com.myproj.Activator
Private-Package: com.myproj
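For completeness, the Activator itself is just a minimal BundleActivator along these lines (a sketch, matching the package named in the bnd file):
package com.myproj;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {

    // called by the framework when the bundle is started
    @Override
    public void start(BundleContext context) throws Exception {
        System.out.println("com.myproj started");
    }

    // called by the framework when the bundle is stopped
    @Override
    public void stop(BundleContext context) throws Exception {
        System.out.println("com.myproj stopped");
    }
}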
My generated jar is empty. Any tips?
P.S.: Here is my eclipse project (exported as a zip-archive) in case it sheds any light on things: https://dl.dropbox.com/u/9162958/scraper.zip
My guess is that "Convert to Maven project" is the trouble. This likely has changed the Eclipse classpath for the project from the bnd default bin folder to 'target/classes'. Can you confirm that it works without converting to maven?
bnd can work with other locations for the bin folder, but you must set the ${bin} property (preferably in cnf/build.bnd). There are some writeups on how to use bndtools with Maven. The reason that bnd does not follow Eclipse's settings here is that they are not available without Eclipse, and a design goal of bnd is that it builds anywhere: the bnd file must therefore be the final arbiter of information.
Anyway, one more tip: activators are not the right way to build OSGi bundles, since they are an evil singleton. Declarative Services is far superior, and we should actually have used a similar mechanism when we designed OSGi.
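As a rough sketch of that alternative, using the standard OSGi Declarative Services annotations (the class name is made up for illustration):
package com.myproj;

import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Deactivate;

// A Declarative Services component: the framework creates it and
// drives its lifecycle, so no Bundle-Activator header is needed.
@Component
public class Scraper {

    @Activate
    void activate() {
        System.out.println("scraper component activated");
    }

    @Deactivate
    void deactivate() {
        System.out.println("scraper component deactivated");
    }
}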
My setup:
Eclipse Luna 4.4.0 (20140612-0600)
Bndtools 2.3.0.REL-20140510-023245
Here is how I made it work:
I downloaded your exported scraper.zip.
Created an empty workspace in Eclipse.
Imported your project from the ZIP archive into the empty workspace.
The default cnf project was automatically created.
By default, bnd is configured to use the bin directory for compiled *.class files, while your Eclipse project is configured to use target/classes. Therefore, I had to change this setting in cnf/build.bnd by adding a single line:
########################
## BND BUILD SETTINGS ##
########################
bin: target/classes
Now, after cleaning and rebuilding the project, bnd creates generated/scraper.jar that contains your Activator.class.
Notes:
You could also adapt your project configuration to use the bin directory instead of target/classes but I assume that you will use Maven later on.
When using bndtools, it sometimes helps when you start with an empty workspace and import your projects one-by-one.
There is a bug in bndtools 2.4 which causes some problems if there are multiple source directories per project. Therefore, I'm still using version 2.3

sbt 0.11: Using a corporate maven repository

How can a corporate Maven repository be used (to the exclusion of other repositories) with sbt 0.11.x, as described in "how do I get sbt to use a local maven proxy repository (Nexus)?"? There is no mention of ivyRepositories in the new sbt wiki on GitHub, so I'm assuming the accepted solution there is out of date.
Step 1: Follow the instructions at Detailed Topics: Proxy Repositories, which I have summarised and added to below:
(If you are using Artifactory, you can skip this step.) Create an entirely separate Maven proxy repository (or group) on your corporate Maven repository, to proxy ivy-style repositories such as these two important ones:
http://repo.typesafe.com/typesafe/ivy-releases/
http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/
This is needed because some repository managers cannot handle Ivy-style and Maven-style repositories being mixed together.
Create a file repositories, listing both your main corporate repository and any extra one that you created in step 1, in the format shown below:
[repositories]
my-maven-proxy-releases: http://repo.example.com/maven-releases/
my-ivy-proxy-releases: http://repo.example.com/ivy-releases/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
Either save that file in the .sbt directory inside your home directory, or specify it on the sbt command line (you will need to specify it explicitly if you have disabled sharing):
sbt -Dsbt.repository.config=<path-to-your-repo-file>
Good news for those using older versions of sbt: Even though, in the sbt 0.12.0 launcher jar at least, the boot properties files for older sbt versions don't contain the required line (the one that mentions repository.config), it will still work for those versions of sbt if you edit those files to add the required line, and repackage them into the sbt 0.12.0 launcher jar! This is because the feature is implemented in the launcher, not in sbt itself. And the sbt 0.12.0 launcher is claimed to be able to launch all versions of sbt, right back to 0.7!
Step 2: To make sure external repositories are not being used, remove the default repositories from your resolvers. This can be done in one of three ways:
Add the command line option -Dsbt.override.build.repos=true mentioned on the Detailed Topics page above. This will cause the repositories you specified in the file to override any repositories specified in any of your sbt files. This might only work in sbt 0.12 and above, though - I haven't tried it yet.
Having the same effect as 1, you can use overrideBuildResolvers := true, with the advantage that you can control the projects where it is applicable, depending on which scope (a project / ThisBuild / Global) you define it in. This works in sbt 0.13.
Use fullResolvers := Seq(...) with the resolver(s) for your corporate Maven repositories in your build files, instead of resolvers ++= or resolvers := or whatever you used to use (see the sketch below).
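A minimal build.sbt sketch of option 3, reusing the placeholder name and URL from the repositories file above:
fullResolvers := Seq(
  "my-maven-proxy-releases" at "http://repo.example.com/maven-releases/"
)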
Finally, note that the sbt launcher script has a bug in reading the sbtopts file, so if you decide to put your common sbt command-line options in there, make sure the last line of the file ends in a newline (Emacs in particular can fail to ensure this, unless configured to do so).
An alternative for Step 2 of the accepted answer (am using sbt 0.13.1):
Add file .sbtopts to the project root directory with contents:
-Dsbt.override.build.repos=true
Another alternative is to add the same line to $SBT_HOME/conf/sbtopts, but that would force the setting for all projects.
Unpack the sbt-launcher.jar and copy the sbt.boot.properties file to a location of your choice. Change the launch script to use this file. In the file, change the repositories section to only contain your local repo and the corporate one. The distinction between Maven and Ivy comes from the given pattern (no pattern means Maven pattern by default).
Here is an example:
[repositories]
local
corporate: http://inhouse.acme.com/releases/
