Avoiding scanning third-party libraries through Veracode's static scan - static-analysis

I'm fixing flaws from my application's Veracode static scan results, and I've realized that it is analyzing third-party libraries in addition to my source code. For instance, it's looking at the Apache Commons libraries and finding flaws inside them.
How can I instruct Veracode not to scan those third-party libraries?

It was solved. Including the jars in the WAR's build path, instead of uploading them together with my code, did the trick.
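That matches how Maven packages a WAR: third-party jars declared as ordinary dependencies end up as pre-built libraries under WEB-INF/lib, separate from your own compiled classes in WEB-INF/classes. A minimal sketch, assuming the library in question is commons-lang (the coordinates are only illustrative):

<!-- Hypothetical excerpt from the WAR module's pom.xml. Declared this way,
     the jar is packaged under WEB-INF/lib by the maven-war-plugin instead of
     its sources being compiled in alongside your own code. -->
<dependency>
  <groupId>commons-lang</groupId>
  <artifactId>commons-lang</artifactId>
  <version>2.6</version>
</dependency>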

Related

The idea behind using Maven to compile source code

I am currently starting my adventure with Maven, and I don't really understand the idea behind using it to automate the compilation of my source code. For the time being I am working on small projects with up to 15-20 classes and one main method in the "app" class. Could someone please explain, with examples, when it is necessary (or recommended) to use a build automation tool to compile source code, and how I could benefit from using it for source code compilation?
Thank you very much in advance!
I was looking for answers to other questions, and I have a lot of work to do, but since I've seen this question, as a Maven fanboy I couldn't resist any longer, so my answer is below.
First of all, I agree with JF Meier, who answered before me, but I think the answer can be improved.
IMO you have to consider Maven not just as a build tool, but as a multi-purpose tool that can help you do very different things. The top three, for me, are:
Compiler. Obviously. Maven allows you to easily compile giant projects with a lot of submodules, even when some of those modules depend on one another.
Dependency and repository manager. Maven allows you to automatically download third-party software and bind this download to the build. This is immediately understandable if you think of framework or API dependencies from big organizations (the Apache Foundation, Spark, Spring, Hibernate and so on), but it's really powerful in every enterprise context.
Example: you have a Maven project (let's say project A) which manages requests coming from a webservice and provides responses. This Maven project relies on another Maven project (let's say project B) which actually builds the webservice jar and uploads it to a company repository. Well, when you have to add a field or a method to the webservice, you just have to implement the new code in project B, upload it to the repo, and change the version in the Maven POMs of both project A and project B. Voilà: now EVERY developer in the company just has to "mvn clean install" project A to get the new version.
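A minimal sketch of what that dependency declaration in project A's pom.xml could look like (the coordinates and version are made up for illustration); bumping the version here is the only change a developer needs to make before running "mvn clean install":

<!-- Hypothetical excerpt from project A's pom.xml: the webservice jar built
     by project B is resolved from the company repository. -->
<dependency>
  <groupId>com.mycompany</groupId>
  <artifactId>project-b-webservice</artifactId>
  <version>1.3.0</version>
</dependency>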
Automatic source and code generator. Since Maven 2.x, a lot of plugins are available (from the Apache Foundation and others) which allow you to generate code and sources (typically XML files) starting from little to no implementation.
Example 1: the CXF plugin is commonly used to generate Java classes from XML or XSD files.
Example 2: the JAX-WS plugin is commonly used to generate a WSDL from a SOAP webservice implementation, or an implementation starting from a WSDL file.
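As a concrete illustration of the first example, here is a sketch of how the Apache CXF code-generation plugin is often wired into a POM to generate Java classes from a WSDL; the WSDL path and the ${cxf.version} property are placeholders to adapt to your project:

<!-- Hypothetical pom.xml excerpt: generate Java classes from a WSDL during
     the generate-sources phase using the Apache CXF codegen plugin. -->
<plugin>
  <groupId>org.apache.cxf</groupId>
  <artifactId>cxf-codegen-plugin</artifactId>
  <version>${cxf.version}</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>wsdl2java</goal>
      </goals>
      <configuration>
        <wsdlOptions>
          <wsdlOption>
            <wsdl>${basedir}/src/main/resources/wsdl/service.wsdl</wsdl>
          </wsdlOption>
        </wsdlOptions>
      </configuration>
    </execution>
  </executions>
</plugin>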
Do you feel the power now?
-Andrea
The question is not very specific, but I will try to answer.
Usually, you want your source code to end up in a jar or war, so that you can use it as a library or run it somewhere (e.g. on an application server).
Maven not only compiles the classes you have and creates the final artifact (jar, war), but also handles your dependencies, e.g. the libraries your project depends upon.

How can you access resources in transitive jar used in native image?

I am using a third-party library in my Quarkus project. This third-party library has a transitive dependency which includes some inner resources.
These resources are loaded at runtime and seem to work when executing my Quarkus project in dev mode; however, when running the built native image, these resources are not found.
Is there a way to include this transitive dependency's resources in the built native image? I tried to specifically include the library in my Gradle dependencies, but that did not work.
Thanks.
By default, the resources are not included in the native image.
You need to include them yourself.
See our extensive documentation about the various issues you can have with GraalVM native executables and how to solve them here: https://quarkus.io/guides/writing-native-applications-tips#including-resources (the link points to your specific issue, but it's better to read the whole doc for a global understanding).
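For reference, the approach described in that guide boils down to telling the native-image build which resource paths to keep. A minimal sketch in application.properties; the glob patterns below are placeholders to be matched against the actual paths the transitive dependency loads at runtime:

# Keep the matching resources in the native executable (comma-separated globs).
# The patterns below are hypothetical; point them at the files the library needs.
quarkus.native.resources.includes=templates/**,config/defaults.yaml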

How to add different flavors to an Android library in Maven?

I have created a library which is distributed with Maven. Right now I would like to add a new library as a dependency, whose size is more than 8 MB.
I want to make that dependency optional, and this is why I think that creating a different library flavor would be the correct way (I might be wrong; I'm open to other solutions).
However, I searched and found that I can't do that with libraries.
Maybe something has changed, or is there a different way to implement an optional dependency for a library, one that is managed by the user who integrates my library? I would like to keep the same dependency name in order to maintain only a single version of the library.
Thanks in advance :)
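One alternative worth sketching here, since the asker is open to other solutions: if the library is distributed through Maven, the heavyweight dependency can be marked optional in the published POM, so it is not pulled in transitively and integrators who want the feature declare it themselves, while the library keeps a single name and version. The coordinates below are made up:

<!-- Hypothetical excerpt from the library's published pom.xml. Marked optional,
     this dependency is not resolved transitively; consumers who want the extra
     functionality add it to their own build explicitly. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>heavy-feature-lib</artifactId>
  <version>2.0.0</version>
  <optional>true</optional>
</dependency>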

Dependency on two versions of a Jar

I have a module X that depends on a third-party library, which in turn depends on apache-commons-collections 2.1.
In module X, I want to use the newer apache-commons-collections 3.0, which has some additional methods compared to 2.1. If I add a dependency on 3.0, I'm guessing this will create a problem, since the class loader just picks up the first class it sees on the classpath. Is there a good way to get around this problem?
Thanks,
S
IMHO there is no really good way to do this without an additional modularity solution (like Java EE's EAR or OSGi). I guess, however, that you're asking about a simple web (or other) module that directly uses this third-party lib. I'm afraid you have to resolve this conflict manually. In fact, Maven won't provide two versions of commons-collections and rely on the classloader's resolution; rather, it resolves the dependency graph and picks the version it considers best, with your POMs' declarations in mind. That means that if you declare a dependency on commons-collections version 3.0 in module X, that version will be used, since this declaration is more important than some third-party lib's dependencies.
That's a serious problem of the Java platform itself, causing things like the famous JAR hell. Unfortunately, it is up to you to choose and declare a commons-collections version that satisfies both you and your third-party lib.
Just add the dependency on 3.0 to your project and Maven will prefer it over the older 2.1. To be explicit, you can add an exclusion. In any case, use the dependency plugin's analyze and tree goals to see what is happening.
Long story short: this happens all the time, it will be fine, and in any case you can control what happens.
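A minimal sketch of what that looks like in module X's pom.xml; the third-party library's coordinates are placeholders, and the exclusion is only needed if you want to be explicit about dropping the transitive 2.1. Running "mvn dependency:tree" afterwards shows which version actually ends up on the classpath:

<!-- Declare commons-collections 3.0 directly; Maven's nearest-wins resolution
     then prefers it over the 2.1 pulled in transitively. -->
<dependency>
  <groupId>commons-collections</groupId>
  <artifactId>commons-collections</artifactId>
  <version>3.0</version>
</dependency>
<!-- Optionally, exclude the old version from the third-party library that
     drags it in (coordinates are placeholders). -->
<dependency>
  <groupId>com.thirdparty</groupId>
  <artifactId>some-lib</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>commons-collections</groupId>
      <artifactId>commons-collections</artifactId>
    </exclusion>
  </exclusions>
</dependency>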

When should I use Import-Package and when should I use Require-Bundle?

OSGi allows for dependencies to be determined via Import-Package, which just wires up a single package (exported from any bundle), and Require-Bundle, which wires up to a specific named bundle's exports.
In building a greenfield OSGi application, which approach should I use to represent dependencies? Most of the bundles will be internal, but there will be some dependencies on external (open-source) bundles.
I believe Require-Bundle is an Eclipse thing (that has now made it into the OSGi spec to accommodate Eclipse). The "pure" OSGi way is to use Import-Package, as it specifically decouples the package from the bundle that provides it. You should be declaring dependencies on the functionality you need (the Java API provided by a certain version of a certain package) instead of on where that functionality comes from (which should not matter to you). This keeps the composition of bundles more flexible.
JavaScript analogy: This is like detecting whether a web browser supports a certain API versus inferring from what the user-agent string says what kind of browser it is.
Peter Kriens of the OSGi Alliance has more to say about this on the OSGi blog.
Probably the only case where you need to use Require-Bundle is if you have split packages, that is, a package that is spread across multiple bundles. Split packages are of course highly discouraged.
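To make the contrast concrete, here is how the two styles would look in a consuming bundle's MANIFEST.MF (the package name, bundle symbolic name, and versions are purely illustrative; you would normally use one style or the other for a given dependency):

Import-Package: org.apache.commons.logging;version="[1.1,2.0)"

Require-Bundle: org.apache.commons.logging;bundle-version="1.1.1"

The first wires the bundle to whichever provider exports a compatible version of the package; the second ties it to one specific bundle and everything that bundle exports.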
Favour Import-Package over Require-Bundle.
Require-Bundle:
specifies the explicit bundle (and version) to use. If a required bundle needs to be refactored and a package moved elsewhere, then dependents will need changes to their MANIFEST.MF
gives you access to ALL exports of the bundle, regardless of what they are, and regardless of whether you need them. If the parts you don't need have their own dependencies, you will need those too
bundles can be re-exported
although discouraged, allows the use of split packages, i.e. a package that is spread across multiple bundles
can be used for non-code dependencies, e.g. resources, Help, etc.
Import-Package:
looser coupling, only the package (and version) is specified and the run-time finds the required bundle
Actual implementations can be swapped out
Dependent packages can be moved to different bundles by the package owner
But requires more metadata to be maintained (i.e. each package name) at lower levels of granularity
I believe Import-Package gives you looser coupling and should be preferred. I use it when declaring dependencies on packages that I don't own, such as slf4j, and I can swap implementations as I wish. I use Require-Bundle when the dependency is something I have control over, such as my own bundles, because any important change would have gone through myself anyway.
Avoid Import-Package.
As packages provide many-to-many relationships between bundles, they are prone to dependency cycles that are hard to detect and avoid.
Require-Bundle, on the other hand, references a single bundle, making the dependency graph protected from cycles by a trivial build-time check.
With Require-Bundle it is much easier to build a layered architecture with an isolated lower level of abstraction.
Import-Package should be better because, as previously said, you can move a package from one bundle to another without changing existing clients' MANIFEST.MF files.
But...
There is a practical reason to use Require-Bundle if you are using Eclipse to develop your bundles:
Eclipse doesn't use packages as units of resolution; it uses bundles. That is, if you use one package of a bundle, Eclipse compiles your bundle without reporting any problem about your use of the rest of that bundle's packages, even though they are not imported.
You could (you are human) think that everything is OK and upload your bundle for deployment, but... your bundle will break at runtime.
I'm sure about it because this problem happened (to me!) today.
The proper solution would be to change the Eclipse classpath container, but if that is not going to be done, you could decide to avoid this kind of problem by requiring bundles instead of importing packages, paying the price mentioned above (no backward-compatible movement of code between bundles).
I'm not convinced that using Import-Package is better, because my default expectation when working with a bundle is to work with the associated public API. For that reason, Require-Bundle makes more sense.
