Fast way to rebuild and reload an OpenDaylight bundle

I am developing a bundle for OpenDaylight. Is there a quick way to just rebuild the bundle without rebuilding the complete Karaf environment?
Thanks in advance!
Max

You can rebuild the bundle, without anything else, by running Maven in the directory containing your bundle’s POM. If you’re on the command-line:
cd your-bundle-directory
mvn compile
(or mvn install, etc., depending on what exactly you want to do).
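If you also want to reload the rebuilt bundle into an already-running Karaf without restarting it, a minimal sketch (assuming the bundle was installed from an mvn: URL and is built as a SNAPSHOT) looks like this. First rebuild into the local repository:
cd your-bundle-directory
mvn clean install
Then, in the Karaf console, either update the bundle by hand or let Karaf watch the local repository for new SNAPSHOTs:
bundle:list | grep your-bundle
bundle:update <bundle-id>
bundle:watch *
Here your-bundle and <bundle-id> are placeholders for your actual bundle name and its numeric id.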

Related

Can I split dependency resolution in Maven into a separate command?

I want to split a mvn install command into two separate ones - dependency resolution and the build itself, like it can be done in NPM (with npm run ci and npm run webpack separately). The reason to do so is to measure how much time is spent on dependency resolution and how much on the build itself.
I have tried to use mvn dependency:go-offline with mvn install -o afterwards. According to the docs this is exactly what I need; however, it does not work: plugin dependencies (declared in the POM under build/plugins) are not downloaded.
Can this goal be achieved somehow?
Indeed, it seems the standard dependency:go-offline goal is not "perfect", as some plugins may still trigger some (re)downloading during the mvn package phase.
(I had noticed this when looking at this SO question about how to optimally rely on Docker's cache.)
To fix this, you can try mvn de.qaware.maven:go-offline-maven-plugin:resolve-dependencies after installing the corresponding plugin mentioned in this SO answer by @user2813807.
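To actually compare the time spent in each phase, a rough sketch using the plugin from that answer (declared in the POM or invoked with its full coordinates as above) together with the shell's time built-in could look like:
time mvn de.qaware.maven:go-offline-maven-plugin:resolve-dependencies
time mvn install -o
The first command should do all the downloading; the second should then run fully offline, so the two timings give the split you are after.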

Can't force Apache Karaf to load a new version of the bundle (version w/o "-SNAPSHOT")

I've tried to update a bundle in Karaf without changing its version. The version doesn't contain "-SNAPSHOT".
Is there a way to do it?
I installed the bundle into Apache Karaf through an mvn: link.
The code inside the bundle was changed without changing the bundle's version (for internal reasons).
Then I removed this bundle from Karaf, and from Nexus.
Next step was deploying new bundle to Nexus.
When I finally tried to install the "new" bundle, I saw there were no changes in it.
It looks like there is some cache inside Karaf where it stores bundles that were already installed.
Adding flags like -c or -cc doesn't help.
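One thing worth trying, as a sketch rather than a confirmed fix: purge the artifact from the local Maven repository as well, and start Karaf with a clean data directory, since mvn: URLs are usually resolved through ~/.m2/repository before Nexus, and Karaf keeps already-installed bundles in its data directory:
rm -rf ~/.m2/repository/com/example/my-bundle/1.0.0
bin/karaf clean
The com/example/my-bundle/1.0.0 path is only a hypothetical example; substitute your bundle's groupId, artifactId and version. Note that bin/karaf clean wipes the whole data directory, so other installed bundles and state are lost too.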

Compile ODL controller

I am trying to follow this example but I found one problem. I am trying to compile the ODL controller, but the file structure has changed compared to previous versions and I don't know which path I have to be in to compile the controller.
I am following
git clone https://git.opendaylight.org/gerrit/p/controller.git
Check that the used Yang tools version is >= 0.5.8-SNAPSHOT.
But I have 0.8.0 (downloaded today from the same link).
And then I have to do this to compile the ODL controller:
cd controller/opendaylight/distribution/opendaylight
mvn clean install
But this path doesn't exist in the version I have downloaded.
In what directory do I have to be to run mvn clean install?
The ping example wiki is old and outdated. That was back when everything except yangtools was in the controller project, before ODL was converted to use Karaf, so the controller/opendaylight/distribution/opendaylight directory is long gone. If you want to create and run the ping example, you would create a Karaf feature and run the Karaf distro in the controller project. You can follow what is done with the toaster sample and its associated wiki, which is pretty up to date: https://wiki.opendaylight.org/view/OpenDaylight_Controller:MD-SAL:Toaster_Step-By-Step.
Just run 'mvn clean install' in the root dir (so, the "controller" dir).
Also, to be safe, I'd delete the "repository" directory in your .m2
dir (usually ~/.m2/repository).
Finally, make sure your Maven settings.xml file is correct; here's a
link for that.
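Putting that together, a rough sketch of the newer workflow (the exact location of the generated Karaf assembly varies between controller releases, which is why find is used to locate it):
git clone https://git.opendaylight.org/gerrit/p/controller.git
cd controller
mvn clean install
find . -type f -path '*assembly/bin/karaf'
The find command prints the start script of the Karaf distribution that the build produced; run that script to start the controller.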

Maven: non-lifecycle-related goal

I have a project for which you need to have a few dependencies downloadable from an internal repo.
If this is not possible, I'd like to provide an option to install the deps locally (using maven-install-plugin).
Point is: since this is fully optional, I do not want this to be part of the regular lifecycle (I've seen solutions where the install plugin is executed in the clean phase). What is the tersest and most elegant way to provide an option to install the prerequisites locally?
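One way to keep this completely out of the lifecycle, sketched here with hypothetical coordinates: don't bind the plugin to any phase at all and just document the manual command, e.g.
mvn install:install-file -Dfile=libs/internal-lib.jar -DgroupId=com.example -DartifactId=internal-lib -Dversion=1.0.0 -Dpackaging=jar
Alternatively, the same install-file executions can live in a profile (say -Pbootstrap) that is never active by default, so the regular build is untouched and the optional step stays a single, explicit command.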

Where can I find a tutorial for installing and running cascading.jruby?

I have Hadoop installed and testing fine; however, I'm unable to find any instructions for a n00b on
how to set up Cascading and cascading.jruby: where to place the Cascading jars, and how to configure Jading to build the Ruby assemblies correctly.
Is anyone using Jenkins to build this automatically?
Edit: more details
I'm trying to build the example word count job from https://github.com/etsy/cascading.jruby
I've installed:
- hadoop, and run the tests successfully
- jruby
- cascading.jruby (gem install cascading.jruby)
- jade - https://github.com/etsy/jading
- ant
Then I created the wordcount sample wc.rb and ran jade to compile wc.rb to a jar:
jade wc.rb
I get the following compile error:
Buildfile: build.xml does not exist!
Build failed
RuntimeError: Ant retrieve failed
(root) at /usr/bin/hjade:89
Which makes sense looking at the jade code, but this isn't covered in the example usage? What am I missing here?
Sorry for the delay; this is my first answer, here.
The issue you describe, Jading not being able to locate its Ant build script when called from a symlink, is indeed an issue. I'd recommend just adding your Jading clone to your PATH rather than creating symlinks (or submit a pull request to fix the issue!).
To address some of your other concerns, I've created a Getting Started page in the Jading wiki which may be of some help. It walks you through getting up and running with local and remote cascading.jruby jobs without installing anything besides prerequisites (Java, Ant, JRuby, and the Hadoop client+config). Included now is a full example wordcount script that should function both locally and on a Hadoop cluster, and has been tested on Etsy's own internal cluster.
And backing up further still to address your question about Jenkins, yes, at Etsy we use Jenkins to build and deploy cascading.jruby (and Scalding) to our cluster. However, that build process does not currently use Jading to produce the job jar. Our build predated Jading, and Jading was an attempt to release a cleaner version of the process we go through to build that jar. Our build could easily use Jading (and the original examples came from actual uses in our code), but we have slightly different requirements for the artifacts produced by our build.
If you have any other issues with Jading, feel free to submit issues or pull requests to the github project.
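For reference, a minimal sketch of the PATH approach recommended above (assuming the Jading clone lives in ~/src/jading; adjust to wherever you cloned it):
export PATH="$HOME/src/jading:$PATH"
cd ~/dev/cascading.jruby.demo
jade wc.rb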
If you are using JRuby, you must be using Bundler as well. In that case you can add cascading.jruby as a dependency in your Gemfile.
You could also try installing it from your project folder:
gem install 'cascading.jruby'
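If you take the Bundler route instead, a minimal sketch (assuming a Gemfile already exists at the project root):
echo "gem 'cascading.jruby'" >> Gemfile
bundle install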
Hope this helps.
I've got this working end to end now.
I had created symlinks to the hadoop and jading binaries in /usr/local/bin.
The scripts need to be run from their own directories in order to find the supporting files.
i.e. the following works (assuming the cascading.jruby example is in ~/dev/cascading.jruby.demo/wc.rb):
cd /usr/local/jading
./jade ~/dev/cascading.jruby.demo/wc.rb
# creates a jade.jar locally in jading folder
cd /usr/local/hadoop
./bin/hadoop jar /usr/local/jading/jade.jar ~/dev/cascading.jruby.demo/wc.rb ~/dev/cascading.jruby.demo/sampledata/in.txt
