I've got a hello world OpenDaylight app (created by following the tutorials) which compiles with mvn clean install, and it appears when I run the Karaf package that is also generated.
However, I am unable to get it to run in another ODL install (I downloaded the binary with all the other packages from the website), and even using a
bundle:install mvn:org.andrew.test
results in an "unable to install bundle" error (I also tried copying the jar to deploy and to system/org/andrew....).
How do you get a bundle which can be used in another install?
Why do you want to bundle:install instead of feature:install?
What most existing ODL projects do for you, and what the example generated by the archetype should also show you how to do for your custom org.andrew.test one (have you used the archetype? try it..), is to have a local karaf/ artifact which correctly depends on the features/odl-something feature of your example. That lets you feature:install it, which will in turn install your bundle(s).
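For illustration, a minimal features descriptor for such a features/odl-something module might look like the sketch below; the feature name odl-andrew-test and the impl artifactId are placeholder assumptions, not something I know your archetype actually generated:

<!-- sketch of a Karaf features descriptor; names are placeholders -->
<features name="org.andrew.test-features" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
    <feature name="odl-andrew-test" version="1.0.0-SNAPSHOT">
        <bundle>mvn:org.andrew.test/impl/1.0.0-SNAPSHOT</bundle>
    </feature>
</features>

Once that features repository is known to Karaf, feature:install odl-andrew-test pulls in the bundle together with whatever it depends on, which is exactly what a bare bundle:install does not do for you.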
In theory, and if you really know what you are doing, you can also get it to work for what you call "in another install" in your question, but you have to use feature:repo-add and whatnot; most people do not use it like that AFAIK (at least in ODL development), so I wouldn't bother if I were you.
If you want to learn more about this in general, outside of OpenDaylight, the general Karaf documentation manual is not bad. Beware that in ODL we've tweaked a few things, though; for example, we have (intentionally) disabled direct installation from ~/.m2/repository (for better isolation).
Related
This is the inverse of my question Install a custom feature or module in Opendaylight?. I am looking to take the Hello World app and add the ability to do a feature:install of the following features:
feature:install
odl-restconf
odl-mdsal-apidocs
odl-openflowplugin-flow-services
odl-openflowplugin-app-table-miss-enforcer
odl-openflowplugin-nxm-extensions
odl-restconf-all
I assume it comes down to listing these features properly in one of the many pom files.
For the record, we currently run ODL 0.12.1 by downloading the .tar.gz from ODL's Nexus server, shelling into karaf, and running the feature:install command with all the above features, and the install succeeds.
I'd like to be able to run that same feature:install for all those features in the Hello World example as well, but there karaf can't find those features.
There are some similar questions out there (such as How to add new features to OpenDayLight Karaf?), however the answers weren't specific enough and seem generic to Karaf. For example, the answer there is about modifying the values of org.ops4j.pax.url.mvn.repositories, yet when I look at the ODL 0.12.1 integration/distribution repo, I do not see this value being used at all.
I think what you are looking for is the featuresBoot config in the etc/org.apache.karaf.features.cfg file.
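As a sketch, and assuming the repositories that provide these features are already listed under featuresRepositories in the same file (that part varies by distribution), the boot line could name all the features from your question:

# etc/org.apache.karaf.features.cfg (sketch; featuresRepositories not shown)
featuresBoot = odl-restconf, \
    odl-mdsal-apidocs, \
    odl-openflowplugin-flow-services, \
    odl-openflowplugin-app-table-miss-enforcer, \
    odl-openflowplugin-nxm-extensions, \
    odl-restconf-all

Karaf installs everything on that list at startup, so you no longer have to run feature:install by hand after each fresh unpack.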
The recommended way to use Gradle is through the Gradle Wrapper (gradlew), which is checked into the project's version control.
My question is: is there any reason to install Gradle myself from http://www.gradle.org/downloads instead of using the wrapper everywhere? (and copying the wrapper to new projects from an older project)
If you work with Gradle occasionally, rather than with one particular project (or set of projects), it's very useful to have Gradle installed on the command line. You can then easily create a script and check that it works fine. With Gradle installed locally this is very easy and fast (no need to download the whole distribution every time). Besides this one particular use case, nothing else comes to mind.
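For example, a local install lets you bootstrap the wrapper in a brand-new project instead of copying gradlew over from an older one. A sketch (recent Gradle versions ship a built-in wrapper task; older ones need a Wrapper task declared in build.gradle first):

cd my-new-project        # placeholder project directory
gradle wrapper           # generates gradlew, gradlew.bat and gradle/wrapper/
./gradlew build          # from here on, everyone can build via the wrapper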
P.S. There's a great tool for managing versions of Gradle (and other tools): GVM.
I've developed a project that has a bundle whose only purpose is to write a file to a certain location on all of the containers running it.
This file will change often, but a change does not really constitute an increase in version number. I also don't want to have 100 versions of this bundle in my repository, so I have left it as a snapshot. This question would also apply if I were doing active development on a project for Fuse Fabric.
Once built, I deploy the bundle to my fabric's maven proxy with:
mvn deploy:deploy-file -Dfile=target/file-1.0-SNAPSHOT.jar -DartifactId=file -DgroupId=com.some.id -Dversion=1.0.0 -Dtype=bundle -Durl=http://username:password@hostname:port/maven/upload
I can then add my bundle to a profile with:
mvn:com.some.id/file/1.0.0
This works the first time.
Then I make a change to the file, rebuild the bundle, and deploy with exactly the same command. I remove the bundle from the profile and add it back in. The maven proxy on the fabric server has the new bundle in it if I check $FUSE_HOME/data/maven/proxy/com/some/id/file/1.0.0/
But on all of the containers running the profile on a separate server, the bundle is not updated. I assume because the version has not changed. However, fabric should be smart enough to tell the difference, as the md5 should be different.
For now I can change the version number and my problem is solved, or clear the maven proxy by hand. But in production I will not be able to clear the proxy on every server, nor can I expect someone to come up with a unique version for the bundle every time they make a small change to this file (which should happen often).
I have already tried adding updatePolicy=always to the fabric maven configuration, but I believe that only affects repositories that it is pulling from, not the proxy.
Any advice on the best way to solve this problem is welcome.
If you are using containers, your old artefacts must be cached in
$FUSE_HOME/instances/CONTAINER_NAME/data/maven/agent/
Delete the old artefacts from here and stop/start your container.
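Something along these lines, with CONTAINER_NAME and the com/some/id/file path as placeholders taken from the question (if unsure of the exact layout under agent/, clearing the whole directory also works, it just re-downloads more):

rm -rf $FUSE_HOME/instances/CONTAINER_NAME/data/maven/agent/com/some/id/file
fabric:container-stop CONTAINER_NAME     # from the fabric shell
fabric:container-start CONTAINER_NAME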
I am working to automate the install of some software.
It relies on some things like the Java JDK, and, well, lots of things that involve manual steps of installing and copying things around.
I would like to be able to test if the various packages are installed and if not install or update them.
How likely is it that I can get MSBuild to do this sort of work? If it's unlikely, then where can I look?
Thanks
The answer is yes. MSBuild can execute any command, as long as that command does not expect a user to be in front of the computer. I know you can do a silent JDK install, so you can just execute that command in your MSBuild target.
However, a more interesting question is: should you do this? I think that performing machine-wide configuration steps as part of the build is bad practice. For certain things, like deployment of your newly built product for a CI cycle, it is OK, but for the purpose of the build itself it will be very inflexible.
What I would recommend in the case of the JDK: since the JDK is big and mostly backwards-compatible, have your build script check whether the correct version of the JDK exists on the machine. If it does not, fail the build and print instructions in the log on how to configure the machine. For smaller dependencies, see this SO question.
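A minimal sketch of that check as an MSBuild target; the JAVA_HOME-based detection and the message text are assumptions you would adapt to your environment:

<!-- fail fast with instructions instead of installing a JDK mid-build -->
<Target Name="CheckJdk" BeforeTargets="Build">
  <Error Condition="'$(JAVA_HOME)' == '' Or !Exists('$(JAVA_HOME)\bin\javac.exe')"
         Text="Required JDK not found. Install it and set JAVA_HOME, then re-run the build." />
</Target>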
We have a number of small projects within our system running on Linux (Slackware 7-11, slowly migrating to RHEL 6.0). Around 50-100 applications and 15-20 libraries. Almost all our applications use one or more of our libraries. Our source tree looks something like this:
/app1
/app2
/app3
/include
/foo/app4
/foo/app5
/foo/app6
/foo/lib1
/foo/lib2
/lib/lib3
/lib/lib4
/lib/include
Now, I've done some work creating some CMakeLists.txt files and have built most of the libs and some of the apps. I'm fairly comfortable using cmake to build. I did this with v2.6, and I recently (an hour ago) upgraded to 2.8. Each of the above projects has its own CMakeLists.txt file specific to the project to do building and installation (no packaging, yet).
I have a requirement to make use of and enforce continuous integration. I've installed and played around with Jenkins, and from what I've seen I'm very impressed. I'm also evaluating JIRA to do our issue tracking.
Just to get things up and going, I've done a cmake install on all the libs, so the apps can find them in the filesystem. Headers are installed to /usr/local/include and libs to /usr/local/lib. Is this a bad thing to do? Would it be better to tell cmake to look in the libs' source directories, use the export interface, or the recently introduced ExternalProject_Add?
Because I'm going to be using Jenkins, I cannot be guaranteed that cmake can find the source or build directory. Of course, I can tell Jenkins to build the projects in order (or at least, build the dependencies first). If an update to a library breaks the building of another project, then I guess it'll be up to someone with 3/4 of a wit to determine this.
Thank you in advance
Just to get things up and going, I've done a cmake install on all the libs, so the apps can find them in the filesystem. Headers are installed to /usr/local/include and libs to /usr/local/lib. Is this a bad thing to do?
No, it is not a bad thing to do, but your build should be able to reproduce resources from scratch. Portability and fixing build bugs will become an issue if things need to be pre-installed on the system outside of the build process. If you are able to do it one of the other ways you mentioned, I would suggest that; but if it's going to make your build that much longer, it's something you need to feel out. My ideology is that everything should be movable to a new Jenkins machine with a fresh install at the drop of a hat. Again, this isn't always achievable, but it's something to strive for.
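If you do go the ExternalProject_Add route, a minimal sketch looks like the following; the lib3 paths and the deps install prefix are illustrative, not taken from your actual tree:

# build a library from source as part of the app's build, instead of
# relying on a pre-installed copy under /usr/local
include(ExternalProject)
ExternalProject_Add(lib3
    SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/../lib/lib3
    CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/deps
)
# the app can then look under ${CMAKE_BINARY_DIR}/deps/include and /lib

Each Jenkins job stays self-contained this way, at the cost of rebuilding the library.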
Because I'm going to be using Jenkins, I cannot be guaranteed that cmake can find the source or build directory. Of course, I can tell Jenkins to build the projects in order (or at least, build the dependencies first). If an update to a library breaks the building of another project, then I guess it'll be up to someone with 3/4 of a wit to determine this.
Well, one of the things I do with interdependent jobs is have the successful build of one job trigger the job that depends on it. So, for example, if B depends on A and A fails, B will never be run, and whoever caused the issue in build A is responsible for it, and so on. This prevents a cascading effect of broken builds that were all caused by one broken dependency. I would suggest that you keep the files of a particular build in its job folder and point the dependent job at the location of the required files. Again, keep your builds separate and clean.
I'm also evaluating JIRA to do our issue tracking.
I highly recommend JIRA as an issue tracking system for a company; you might want to look at this Jenkins plugin for integration. If you're using git, and you don't mind hosting your code off-site, I would give GitHub Issues a shot as well.
Good luck, you seem to be on the right track.