Dependency Walker not showing all the dependent DLLs - unmanaged

I have a Fortran DLL, and I want to know the assemblies it depends on for redistribution purposes.
One thing I found out is that Dependency Walker doesn't show all of the dependencies, i.e. there are some DLLs that my assembly depends on, but Dependency Walker doesn't show them.
An example would be a DLL that makes use of the Intel MKL LAPACK DLLs, where Dependency Walker doesn't show that dependency.
Why is this so? Any idea how to fix this problem, or is there another, more reliable tool that I can use?

Dependency Walker only shows the static dependencies if you don't run the executable. Run it in profile mode, and it will also show the DLLs that are loaded dynamically.
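If you only need the static import table, the dumpbin tool that ships with Visual Studio offers a quick cross-check (a sketch; mylib.dll is a placeholder for your Fortran DLL):

    rem List the DLLs named in the import table (static dependencies only)
    dumpbin /dependents mylib.dll

    rem DLLs pulled in at runtime via LoadLibrary (which is presumably why the
    rem MKL LAPACK dependency is invisible here) will not appear in this list;
    rem profiling in Dependency Walker is still needed to catch those.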

Related

Will go build pack unused external dependencies?

I have a lib module planned to be used across projects. It will depend on many other modules.
But not all of the projects will use the functions provided by lib. I'm wondering: if a project only depends on func B in package b provided by lib, will all the unused modules be packed in during the project's build?
I understand Go's smallest build unit is a package. In that case, will the external-dependency modules A and N be packed into my project's binary?
How can I test this?
But not all of the projects will use the functions provided by lib. I'm wondering: if a project only depends on func B in package b provided by lib, will all the unused modules be packed in during the project's build?
No.
I understand Go's smallest build unit is a package. In that case, will the external-dependency modules A and N be packed into my project's binary?
No idea what you are asking.
How can I test this?
Inspect the generated binary. (No, don't do that).
Honestly, this is a 100% non-problem. There is nothing to see or worry about here; forget all this. The generated binary contains what is necessary and nothing else. Unused code is stripped during linking.
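If you still want to see it for yourself, you can search the symbol table of a built binary for the unused packages (a sketch; the module and package paths are hypothetical):

    # Build a binary whose code calls only lib/b.FuncB.
    go build -o app ./cmd/app

    # Symbols are prefixed with their package import path, so grep for them:
    go tool nm app | grep 'example.com/lib/b'   # FuncB's symbols show up
    go tool nm app | grep 'example.com/lib/a'   # empty: unused package a was never linked in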

How to add a dotnet compile-time-only dependency

I have a package which should only be a compile-time dependency, i.e.
included in the build but not part of the output, like "CopyLocal=false" works in non-SDK projects.
I tried <ExcludeAssets>runtime</ExcludeAssets> on the PackageReference, which sort of works, but not consistently: sometimes the dependency is also excluded from the build. (Reopening the solution sometimes fixes it. It is all very random.)
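For reference, what I tried looks roughly like this (the package name is a placeholder):

    <ItemGroup>
      <!-- Compile against the package, but keep its assemblies out of the
           output folder and out of the runtime dependency list. -->
      <PackageReference Include="Contoso.Constants" Version="1.0.0">
        <ExcludeAssets>runtime</ExcludeAssets>
      </PackageReference>
    </ItemGroup>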
So I have two questions:
Is excluding "runtime" supposed to also exclude the package from the build, or is that a bug?
Is there another way to include a dependency in the build but exclude it from the runtime output?
Background:
I have two types of dependencies where I need this functionality. One is a licensed product where a "generic" version of the assembly is used for the build, similar to reference assemblies in Visual Studio. The real assembly is available in the production environment.
The second one is an assembly containing only constants. It is not needed at runtime, since the constants are compiled into the consuming assembly.
Excluding the constant-only assembly is just cosmetic, but shipping the "generic" assemblies causes problems, e.g. on updates, where they can overwrite the real ones.

Direct dependency vs transitive dependency in build systems

I was studying Maven's build system, and it adds a lot of transitive dependencies because of its transitive dependency mechanism (Maven itself does not add dependencies; the transitive dependency mechanism does). I see issues with it, like major version conflicts and unknown dependencies coming in.
I was wondering why the system is designed this way and why it doesn't just take direct dependencies. My library does not need to depend on something which my dependency is using but not my library (I mean I understand why it needs to be included in the build list, my dependencies need to be built using those, but why does it need to cause major version conflicts?). Am I missing something fundamental here? One thing I can think of is that my library's list of build dependencies could grow very large because of all the direct dependencies I would have to declare, but that does not seem to be as big a problem as the problems with the transitive dependency system.
I am new to build systems, so please don't be too harsh. I also tried to google this question but didn't find useful answers, so please feel free to point out anything I might have missed.
Thanks
If you need library A to run, and library A needs library B to run, and B needs C to run, it is very tedious to figure all this out by hand and add all the relevant dependencies to your project.
Before Maven and Gradle, many people worked that way and found out that it is much easier to let the build tool figure out the transitive dependencies.
My library does not need to depend on something which my dependency is using but not my library [...]
This is your major misconception. There are two possibilities:
The direct dependency of your library exposes types from the transitive dependency in its public API. To use this public API you need to access these types, so you need the transitive dependency at compile time.
The direct dependency of your library only uses its own dependency internally, not in its public API. In this case, your library does not need the transitive dependency at compile time. But as soon as your library code runs (even in a test), it may use some functionality of its direct dependency that internally uses functionality of the transitive dependency, and without it your library code will fail.
[...] I mean I understand why it needs to be included in the build list, my dependencies need to be built using those [...]
There is no actual build list (or order) for external dependencies, because they are already built by the time they are used (the downloaded .jar files contain compiled .class files). But as I mentioned above, you will need the transitive dependencies either at compile time or at runtime (e.g. in tests), so your build system (Maven or Gradle) will fetch them for you.
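To see which transitive dependencies are actually resolved and where they come from, you can ask Maven for the resolved dependency tree:

    # Print the full resolved dependency tree of your project;
    # -Dverbose additionally marks dependencies that were omitted
    # because another version won the conflict resolution.
    mvn dependency:tree -Dverbose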
[...] but why does it need to cause major version conflicts?
#khmarbaise already explained in his comment why and how version conflicts between transitive dependencies may occur:
You are using two libs, X and Y, and both of them use another lib (A). X uses A in version 1.0.0, but Y uses A in version 2.0.0. In the end you can't have both on the classpath; a decision must be made for one version. So, depending on how X and Y are implemented, either X can fail running against A in v2.0.0 or Y can fail running against A in v1.0.0. This can happen when X or Y is updated. The same is true for other version combinations, like A in 1.0.0 vs. 1.1.0 (if compatibility is not 100%).
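When you do hit such a conflict, the usual remedy is to pin one version for the whole build via dependencyManagement (the coordinates below stand in for the hypothetical lib A):

    <dependencyManagement>
      <dependencies>
        <!-- Force a single version of A, overriding whatever versions
             X and Y pull in transitively. -->
        <dependency>
          <groupId>org.example</groupId>
          <artifactId>lib-a</artifactId>
          <version>2.0.0</version>
        </dependency>
      </dependencies>
    </dependencyManagement>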

OpenCV in a Maven project

I have a question, please: when using OpenCV in a Maven project, do I need to install OpenCV on my computer?
I've added the dependency to pom.xml and it was resolved, but when running the code I get the error "no opencv in java.library.path", and the DLL path needs to be specified.
Thanks for your help.
"When using opencv in a Maven project, do I need to install opencv in
my computer?"
Yes, you do.
When adding dependencies you are telling the compiler where to find the library you want to use. In OpenCV's case, the Maven artifact provides the Java wrapper, but the native library (the DLL) still has to be present on your machine and accessible to your project at runtime.
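Concretely, even with the Maven artifact on the classpath, the native library has to be loaded once before any OpenCV call. A minimal sketch (the library path is machine-specific):

    import org.opencv.core.Core;
    import org.opencv.core.CvType;
    import org.opencv.core.Mat;

    public class OpenCvSmokeTest {
        public static void main(String[] args) {
            // Loads opencv_java<version>.dll; start the JVM with the folder
            // containing it on the native path, e.g.
            //   java -Djava.library.path=C:\opencv\build\java\x64 OpenCvSmokeTest
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

            // If loading succeeded, OpenCV calls work:
            Mat m = new Mat(3, 3, CvType.CV_8UC1);
            System.out.println(m.dump());
        }
    }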
Links that may help:
Maven: Introduction to the Dependency Mechanism
Understanding Dependencies

Maven shade relocations across all modules?

I have a Maven multi-module project that has one parent POM project and a bunch of modules. One of these modules is the "main module" that has all the libraries shaded into it. All other modules depend on that module and use the libraries it provides.
The main module is a Bukkit plugin that loads the other modules as extensions. These extensions are each loaded with their own classloader, but the loaded classes are shared between the loaders so that they can depend on each other. They are also able to depend on other Bukkit plugins, as their parent classloader is Bukkit's PluginClassLoader, which likewise shares loaded classes between plugins to allow interaction.
That's where the problems start: different plugins may use the same library, but the classes of that library might get loaded by different classloaders, which causes LinkageErrors and other problems.
My idea to solve this was to relocate the libraries in the main module via the maven-shade-plugin. That works as expected for libraries that are only used by the main module. However, relocating libraries used by the other modules causes ClassNotFoundExceptions at runtime, because the modules still look up the original package names instead of the relocated ones.
Then I tried changing the imports to the relocated packages, but my IDE (IntelliJ) doesn't find the classes.
Does anyone have an idea how to solve this relocation problem, or maybe a different approach to the classloading issue?
Five years later, in a very similar context (Bukkit -> SpongeApi), I encountered this problem again, but this time I found the (probably only satisfying) solution:
The main module had its shaded version as the main artifact, so dependents could only see relocated classes and were unaware of the original class names; IntelliJ does not process the relocations, so it could not find those relocated classes. Attaching the shaded version as a secondary artifact instead (the shadedArtifactAttached option set to true) keeps the unshaded jar as the main artifact and makes the original classes visible to the dependents again. This made no difference at runtime in our case, as the main module is a provided dependency anyway, and it also prevents consumers from accidentally using relocated classes directly.
The dependent modules then have to apply the same relocation rules as the main module, so that the shade plugin rewrites the class names to the ones that are actually available at runtime.
This way IntelliJ is not aware of the relocations, but it also doesn't need to be. If necessary, the relocations can be configured in a parent POM for consistent rules across all projects.
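A minimal configuration along these lines might look as follows (the package patterns are placeholders); the <relocations> block is the part that must be identical in the main module and in all dependent modules:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <!-- Keep the unshaded jar as the main artifact and attach the
                 shaded jar with a "shaded" classifier next to it. -->
            <shadedArtifactAttached>true</shadedArtifactAttached>
            <relocations>
              <relocation>
                <pattern>com.example.somelib</pattern>
                <shadedPattern>com.myplugin.libs.somelib</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>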
I had almost exactly the same problem you have/had (judging from the age of this question). Although I don't have a cleaner solution for the libraries overriding other plugins' versions, I do have a workaround for IntelliJ not recognizing relocated classes.
To stop it from complaining, I added the shaded jar (with the relocations) as an IntelliJ library to the target module.
You can do this like so:
Go to File > Project Structure... > Modules > (target module) > Dependencies.
Select the shaded jar using Add (green +) > 1. JARs or directories....
You should now see the shaded jar in the libraries list.
Although it seems to work at first glance, this workaround has a few caveats:
Non-relocated classes are still visible to your code through Maven's module dependency, and if you happen to use them, you'll only find out when compiling with Maven. (You could remove the module dependency, but it gets re-added every time you reimport your POM.)
You'd have to update the jar path every time you change your project's version if you include a version number in your jar file name. (Workaround: specify a static project.build.finalName; see the sketch after this list.)
When you add new methods or change signatures, you need to compile the library module again. (This can be worked around by creating a separate module for shading dependencies, which would also resolve the file-name issue.)
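For completeness, the static file name mentioned above is a one-line POM setting (the name itself is arbitrary):

    <build>
      <!-- Always produce target/myplugin-shaded.jar, independent of the version -->
      <finalName>myplugin-shaded</finalName>
    </build>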
