I have several projects that build reusable libraries. All these projects are under source control.
When I use these libraries in a project I simply link to the same ONE version on my local drive. However, as you can imagine, this can cause problems when I commit back and a different developer tries to clone the repository.
What is the best practice when using components also under source control? Should I include the "library projects" in the "main project" source control? Will this cause problems?
NB: The libraries take quite a few compiler directives, so it's almost impossible to just compile a static version and link to that. Plus I'm still developing them in parallel.
You have two main kinds of dependencies:
source dependencies (you need to include, within the sources of your project, source from another project),
binary dependencies (you need to include a packaged set of files, like the ones found in a shared library).
If, when you say "I use these libraries in a project", you mean you need the binaries in order for your project to compile, then you could store said binaries in an external repository (i.e. not a (D)VCS like Mercurial, but an artifact repository like Nexus).
But if you mean you need to include sources, because you are also making some evolutions to those libraries while using them to develop your project, then Mercurial subrepos are a better fit.
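For the subrepo route, the wiring is a single .hgsub file at the root of the main project, mapping a path in your working copy to the repository that should be checked out there. A minimal sketch (the paths and URLs below are made up):

    libs/libfoo = https://hg.example.com/libfoo
    libs/libbar = https://hg.example.com/libbar

After hg add .hgsub and a commit, Mercurial records the exact subrepo changesets in .hgsubstate, so anyone cloning the main project gets matching revisions of the libraries.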
In my own experience, maintaining compatibility with libraries that you are writing simultaneously is drastically improved by using export maps to provide multiple versions of your interfaces to client programs. The best guide I know of is Ulrich Drepper's http://people.redhat.com/drepper/dsohowto.pdf
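For reference, an "export map" is just a linker version script. A minimal sketch for GNU ld, with invented library and symbol names, might look like this (passed to the link with -Wl,--version-script=libfoo.map):

    /* libfoo.map: symbol version script for libfoo.so */
    LIBFOO_1.0 {
        global:
            foo_create;
            foo_process;
        local:
            *;              /* everything else stays private */
    };

    LIBFOO_1.1 {
        global:
            foo_process_ex; /* added in 1.1 */
    } LIBFOO_1.0;           /* 1.1 builds on 1.0 */

Old clients keep binding to the LIBFOO_1.0 symbols while newer ones can pick up LIBFOO_1.1, which is what makes developing the library and its consumers in parallel bearable.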
If the libraries are under your source control, life should be easy. What I tend to do is the same as I do for different versions of third party libraries: Have different folders for different versions.
The third party library folder structure looks like this:
- General
  - Delphi
    - Components
      - LibX
        - LibX 9.2.1.3890
        - LibX 10.1.0.7151
      - LibY
        - LibY 3.6
        - LibY 5.1
    - Plugins
Each and every project defines its dependencies on specific versions of each library. Reverting to an old version of a project thus also reverts the dependencies to the older versions of the library(ies).
Now with third-party libraries you generally don't have as many different versions as you do with your own libraries, but the same principles apply. To aid "current development" - where you don't have a particular version number yet - you could simply have a "head" version. Then when you "release" a version of your library, just add that version's folder and adjust the project definitions that have up till now used "head" (because of the parallel development) to depend on the new version number...
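For instance, in a modern (MSBuild-based) Delphi .dproj, the dependency on a specific version boils down to a search path entry pointing at that version's folder. A rough sketch, reusing the folders above - the DCC_UnitSearchPath element name is from memory, so verify it against your Delphi version:

    <PropertyGroup>
        <!-- search path pinned to specific library versions (element name assumed) -->
        <DCC_UnitSearchPath>..\General\Delphi\Components\LibX\LibX 9.2.1.3890;..\General\Delphi\Components\LibY\LibY 5.1;$(DCC_UnitSearchPath)</DCC_UnitSearchPath>
    </PropertyGroup>

Because that path lives in the project file, it travels through source control with the project, and reverting the project reverts the pinned library versions along with it.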
I would love to use vcpkg to manage our dependencies on third-party libraries, but I'm not sure how this would work in our environment. We have hundreds of projects in our solution, and we release new versions of our software over time. This is a very common situation in software development, but I don't understand how vcpkg can support it effectively.
Global installation will not work; we need to tie our releases to specific versions of packages. So we're left with using manifests. In a manifest, you specify a "builtin-baseline", which is a great way to pin the build of a project to a point in time, with some guarantees that the dependencies between the different packages are correct.
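To make that concrete, each project currently carries a manifest along these lines (the package names and the baseline commit below are placeholders, not our real ones):

    {
      "name": "some-project",
      "version": "1.0.0",
      "builtin-baseline": "0123456789abcdef0123456789abcdef01234567",
      "dependencies": [
        "boost-asio",
        "fmt"
      ]
    }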
This is all great for a project. How does one manage hundreds of projects?
We would like to use the same baseline across all projects. Is there a way to specify the builtin-baseline in one place?
Some packages are common to all projects. Is there a way to specify this in one place?
We use project files for building. By default, vcpkg uses a project-local install directory, and we don't need hundreds of copies of Boost. There is a vcpkg project setting for "Installed Directory" which allows a global location, but again, this setting is per-project. I don't see any integration with .props files. Is there a way to manage this?
Is there a way to get a solution-wide listing of all packages?
Would using CMake make this all easier?
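For what it's worth, the kind of thing we are hoping for is a single Directory.Build.props at the solution root that every project picks up automatically, roughly like this - the vcpkg property names here are assumptions on my part, recalled from its MSBuild integration, so check them against your vcpkg.targets:

    <!-- Directory.Build.props at the solution root (picked up by MSBuild automatically) -->
    <Project>
      <PropertyGroup>
        <!-- property names assumed; verify against the vcpkg MSBuild integration -->
        <VcpkgEnableManifest>true</VcpkgEnableManifest>
        <VcpkgInstalledDir>$(SolutionDir)vcpkg_installed\</VcpkgInstalledDir>
      </PropertyGroup>
    </Project>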
A dependency that we rely on (https://github.com/bytedeco/javacpp-presets/tree/master/ffmpeg) has recently split into an LGPL and a GPL version, depending on how the underlying ffmpeg is configured.
There are two different sets of artifacts that are released (e.g. https://mvnrepository.com/artifact/org.bytedeco/ffmpeg-platform-gpl/4.3.2-1.5.5 and https://mvnrepository.com/artifact/org.bytedeco/ffmpeg-platform/4.3.2-1.5.5).
The API is the same for our purposes - we don't need to change our code. It's dynamic at runtime in terms of what is supported, but that is OK; it's already flexible given the different hardware support.
I'd like to build two sets of artifacts as parallel paths up through our tree (e.g. two versions of core, api, viewer, examples, etc.) as jars with different license dependencies, to allow the user to choose which one they prefer. The goal is that the user can choose a particular version of our code and the dependencies "just work", in that the right dependencies are either included in the uber-jar or get pulled in via Maven.
So I'd have a jmisb-api-lgpl-${version}.pom/jar (and maybe a jar-with-dependencies uber-jar) that depends on jmisb-core-lgpl-${version}.[pom, jar], which in turn depends on ffmpeg-platform-${other version}.[pom, jar]. And built at the same time, jmisb-api-gpl-${version}.pom/jar that depends on jmisb-core-gpl-${version}.[pom, jar], which in turn depends on ffmpeg-platform-gpl-${other version}.jar.
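Concretely, the only difference between the two cores would be which bytedeco artifact they pull in; the dependency sections would look something like this (a sketch only - the version numbers are the ones from the links above):

    <!-- jmisb-core-lgpl/pom.xml -->
    <dependency>
      <groupId>org.bytedeco</groupId>
      <artifactId>ffmpeg-platform</artifactId>
      <version>4.3.2-1.5.5</version>
    </dependency>

    <!-- jmisb-core-gpl/pom.xml -->
    <dependency>
      <groupId>org.bytedeco</groupId>
      <artifactId>ffmpeg-platform-gpl</artifactId>
      <version>4.3.2-1.5.5</version>
    </dependency>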
What is the preferred / recommended way to do that, or alternatively, what is a relatively clean way that builds both without needing to manually select which one to generate?
I have read the following essay:
Resources (library) in JSF 2.0
It's about versioning of web resources for JSF2 project.
May I know how I could do this when my project is Maven-based? What needs to be set up or added in the Maven configuration?
Thanks a lot.
Effectively your question is a combination of three existing questions on Stack Overflow.
One is already superfluous since you already know how JSF versioning works; you just (unfortunately) referred to an external site instead of the existing question. From that question you only need to 'remember' the format of the version number that has to be used.
The second part should also not have been too difficult to come up with. JSF is a runtime framework and has no build/deploy-time features, but you already use Maven, so what is needed is to copy (move?) the resources to a new location at build time. The source folder can differ: it can be an additional resources folder, e.g. src/main/myresources, that you do not treat as a resources folder (since it would otherwise end up in the classes folder by default), or it can be a folder inside the web content, in which case you need to move it (copy/delete). This is all for you to find out.
The third part is that the destination folder needs the version in the right format:
1_0
1_1
1_2_3
Since this differs from the project version format, you need to search/replace it so that the destination folder ends up with the project version in the right format. There are features for this in Maven as well.
So you see, it all boils down to breaking a problem down into manageable parts...
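A minimal sketch of the copy step, assuming a manually maintained jsf.resource.version property and the standard maven-resources-plugin (folder and library names reuse the example above; adapt freely):

    <properties>
      <!-- JSF wants underscores, so maintain this by hand or derive it from project.version -->
      <jsf.resource.version>1_0</jsf.resource.version>
    </properties>

    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-resources-plugin</artifactId>
          <executions>
            <execution>
              <id>copy-versioned-resources</id>
              <phase>prepare-package</phase>
              <goals>
                <goal>copy-resources</goal>
              </goals>
              <configuration>
                <!-- drop the resources into a versioned folder inside the exploded war -->
                <outputDirectory>${project.build.directory}/${project.build.finalName}/resources/mylib/${jsf.resource.version}</outputDirectory>
                <resources>
                  <resource>
                    <directory>src/main/myresources</directory>
                  </resource>
                </resources>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>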
I am using the third-party view control at https://github.com/nicklockwood/SwipeView in my project. That project does not support CocoaPods or Carthage.
Note: my entire project is Swift and this other code is Objective-C.
To integrate it, I just dragged the .h and .m files into my project.
Is this the proper approach? Should I instead have created a new target and put those files in there? Are there any considerations that should inform this choice?
I've used this technique (separate targets building static libraries) for third party code - even when that library does support CocoaPods or Carthage.
One benefit for me was when there were breaking changes, particularly with newer versions of the tool chain, before the third party code was updated. It was straightforward to disable e.g. certain newer compiler warnings in that target alone while keeping the rest of the project as clean / safe as possible.
It's also reasonably tidy when mixing Obj-C and Swift.
I haven't found a downside, as long as you're happy to manage the project integration yourself (that might otherwise be handled by CocoaPods). I only tend to have one or two third-party libraries in my macOS projects.
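Whichever way you organise the target, the Objective-C classes reach Swift through the bridging header (or a module map, if you go the framework route). Roughly - the class and property names come from SwipeView itself, but treat the usage below as a sketch rather than a recipe:

    // MyProject-Bridging-Header.h
    #import "SwipeView.h"

    // Then, from Swift:
    let swipeView = SwipeView(frame: view.bounds)
    swipeView.dataSource = self
    swipeView.delegate = self
    view.addSubview(swipeView)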
We check all of our source code's dependent third-party JARs into source control along with our source code. When needed, we manually download updates to third party JARs and replace those JARs that are in source control with the newer versions. We haven't felt the need to use Maven yet as this process seems simple enough for us. But are we missing something of great value by not using Maven? Or does our scenario not warrant using Maven?
"JARs dont change much", I hear this all the time.....
Storing jars in the SCM is simple in the beginning of the project. Over time the number of jars gets larger and larger.... Wait 2 or 3 years and nobody remembers where the jars came from, what their licensing terms were and most commonly what versions are being used (important to know when analysing security vulnerabilities).....
The best article I've read recently making the case for a repository manager is:
http://www.sonatype.com/people/2012/07/wait-you-dont-have-a-repository-manager/
A little irreverent, but it does make a valid point about the kind of technical inertia one encounters all the time.
Switching a project team from ANT to Maven can be scary.... Maven works quite differently, so I find it is best deployed with greenfield or adventurous project teams. For the old-school ANT users, I recommend Apache Ivy; it allows such teams to outsource the management of their dependencies but keep the build technology they're comfortable with.
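For a flavour of what Ivy asks of an ANT team: the dependency declarations are just one extra file, ivy.xml, sitting next to build.xml (the coordinates below are only an example):

    <ivy-module version="2.0">
        <info organisation="com.example" module="my-app"/>
        <dependencies>
            <dependency org="org.apache.commons" name="commons-lang3" rev="3.12.0"/>
            <dependency org="com.google.guava" name="guava" rev="31.1-jre"/>
        </dependencies>
    </ivy-module>

The ivy:retrieve Ant task then pulls the jars from a repository manager (or Maven Central) at build time, instead of out of the SCM.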
Ultimately the biggest benefit of using Maven is not dependency management; it's the standardized build process. I've seen several failed attempts to create a "standard" ANT build process. The problem is that every build engineer has his own opinion on what the standard should be.... Maven's approach of forcing users to write build plugins may appear restrictive in the beginning, but, just like with the iPhone, developers eventually discover "there's a Maven plugin for that" :-)
When it comes to dependency management Maven really can be quite valuable. As Mark O'Connor suggests, running a local repository manager would likely be better than checking the artifacts into source control.
There are many tools (like m2e in eclipse) that can help with dependency management and provide valuable feedback on which modules or dependencies require which other dependencies. Maven will also make sure to get the appropriate version of a dependency even if different modules depend on different versions of a given library. That will help prevent duplicate versions of the same jar showing up in your deployed project as long as they have the same group and artifact id.
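As an illustration, if different modules pull in different versions of the same library, the version that should win can be pinned in one place in the parent POM (the coordinates below are just an example):

    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>com.google.guava</groupId>
          <artifactId>guava</artifactId>
          <version>31.1-jre</version>
        </dependency>
      </dependencies>
    </dependencyManagement>

Modules then declare the dependency without a version and all receive the managed one.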
Even for a very simple project I don't think I would resort to checking dependencies into the source control system.
It's not only about third-party libraries; it matters most when you have multiple repositories. In our case, we had four repositories with lots of inter- and intra-dependencies.
Actually, I started this answer and then had to step away for 15 minutes to talk to a colleague about a problem that happened after someone forgot to update the .jar of one project in the other project's lib directory.
And it looks more professional :)