The CDI or ArC reference documentation of Quarkus does not even mention the name ArC, except in package names.
Is ArC an acronym? Does it mean anything other than the CDI flavour of Quarkus?
In fact, the full name is ArC DI and it's a reference to arc welding. As you probably know, Weld is the CDI reference implementation and an open source project sponsored by Red Hat.
I created a new .NET Core 1.1 solution and noticed an odd behavior: if I create multiple projects in the solution and chain-reference them, I'm able to freely access types located in a dependency of a dependency, any level down.
This is an example:
We have a Sandbox solution, a Baz class library project, a Bar class library project referencing Baz and a Foo console app project referencing Bar.
From the Foo project I'm able to access and use BazThing, a type defined in the Baz project, even though Foo doesn't have a reference to Baz.
This works with NuGet packages too: if I add Entity Framework to Baz through NuGet I'm able to use the DbContext type in the Foo project.
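To make the scenario concrete, here is a minimal sketch of the three projects (only the BazThing name comes from the description above; the members and strings are invented for illustration):

```csharp
// Baz/BazThing.cs -- Baz class library, no project references
namespace Baz
{
    public class BazThing
    {
        public string Name { get; set; }
    }
}

// Bar/BarService.cs -- Bar class library, references Baz
namespace Bar
{
    public class BarService
    {
        public Baz.BazThing Create()
        {
            return new Baz.BazThing { Name = "created in Bar" };
        }
    }
}

// Foo/Program.cs -- Foo console app, references only Bar
namespace Foo
{
    public class Program
    {
        public static void Main()
        {
            // Compiles and runs even though Foo has no direct reference to Baz:
            // the Baz assembly flows to Foo transitively through the Bar project reference.
            var thing = new Baz.BazThing { Name = "created directly in Foo" };
            System.Console.WriteLine(thing.Name);
        }
    }
}
```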
This is a huge problem when the projects are used to implement layer segregation.
Developers are now allowed to access and use implementation details of the dependencies, or to bypass layer segregation, both "maliciously" and by mistake (in the above-mentioned example, IntelliSense will happily suggest BazThing when typing "ba", without any warning).
Is this how things will work from now on, or are we missing something?
Is it possible to prevent/inhibit this behavior somehow?
Where can I find documentation about this behavior?
That is intended behavior of modern NuGet/MSBuild. It is called meta-packages/transitive dependencies and is used, for example, by the NETStandard.Library package to include all the libraries of the base class library. I do not think there is a way of hiding them. Expect this feature to be added to full .NET Framework projects within the next releases of VS.
Aside from your questions, my personal opinion here is that hiding artifacts behind reference trees may look useful at first sight but does not bring any architectural benefit. A loaded assembly can be invoked, one way or the other. Layering, layer bridging and adherence to architecture can only be taught, learned, reviewed and documented. Teach the devs but do not build walls around them. Discipline should not be enforced by the artifacts but by the devs themselves.
I have been trying to use ShareKit in my project, but it came up with ARC restriction errors, so I've disabled ARC in all the ShareKit files using the -fno-objc-arc compiler flag.
But now that I've turned off ARC there are loads of errors and issues. Does anyone know a solution to this, or do I have to debug all the code?
Also, I've looked on the web but can't seem to find one: does an ARC-compatible version of ShareKit exist?
Thanks in advance
The errors are probably due to using the weak qualifier for a property in one of your Objective-C classes that is being synthesized in a non-ARC class: MRC can't do this. Instead, use unsafe_unretained or the MRC equivalent, assign.
Remember, the Clang Static Analyzer is your friend, especially for MRC and CoreFoundation code.
I'm getting the following validation error on my layer diagram:
Error 65 AV0001 : Invalid Dependency : Weld.Interface.Core(Assembly) --> Weld.Interface(Namespace)
Layers: Application Framework Core, Application Framework | Dependencies: Namespace Reference D:\Projects\Windows Projects\Weld\Weld\ModelingProject1\Weld.layerdiagram 0 0 ModelingProject1
These 2 assemblies are set up as different layers and the assembly name represents the namespace starting point as well.
Weld.Interface.Core: This assembly and namespace does not have a reference to Weld.Interface and only references .NET Framework classes
Weld.Interface: This assembly and namespace does not have a reference to Weld.Interface.Core
There is no dependency between these two layers in the dependency diagram, so I am confused about why I am getting this error: there is no dependency in the project or code, and no dependency is even set up in the layer diagram.
Somehow the validation logic in the layer diagram is seeing a non-existent dependency and reporting it as an error.
Any ideas what either I might have missed or what is causing this problem?
OK, I figured out what was going on with my situation and thought I'd pass it along here. It appears to be an issue of cached references to assemblies. When the modeling project was originally created, it was in its own solution, separate from the assemblies it was intended to model. As such, it required listing them within the Layer References for the Modeling project.
The project has since been incorporated into the overall solution, but I believe the project references originally declared when it was separate from the solution took precedence and, ultimately, were referencing old code. I removed the assemblies from the Layer References of the Modeling project and everything was smooth sailing after that.
I've got a .net solution (written in C++/CLI) which references some .dll projects as follows:
MainProject->ProjectA->ProjectB
MainProject->ProjectB
Originally I'd referenced both ProjectA and ProjectB from MainProject, which gave me the warnings mentioned above.
I can remove the warnings by removing the reference to ProjectB from MainProject, but it makes it less obvious that MainProject relies on ProjectB. Is this the right thing to do to get rid of the warnings?
Speaking in general terms, a system of dependencies can be depicted as a directed graph where each node is a software component and each edge is a dependency. In my opinion, the more the graph can be simplified, the better.
Yeah that's fine.
If you have ReSharper, you can view the dependency graph by right-clicking MainProject --> Project Hierarchy.
I just want to describe, but not explain, the following relevant behaviour.
project CSCommon in C#
project CS1 in C#, using CSCommon
project CPP1 in C++, using CSCommon
project CPPMain, using CPP1
If each project has its own output path, I receive C4945.
If all projects have a common output path, the warning disappears.
I had the same problem as you, and I solved it exactly as you described: remove the reference to ProjectB (in your specific case).
That is the only way I know how to fix this error, short of disabling it.
No, removing the reference is probably not the correct way to handle it.
See https://stackoverflow.com/a/12423588/321013
Set Use Dependencies in Build=false for your references.
The point is that you should have all references that the code in the project itself uses as direct references, but the setting Use Dependencies in Build=TRUE interferes with that, as it also pulls in the transitive references, generating conflicts if you have direct references as well (at least on my VS2005).
To what extent, if any, is MEF a replacement for PRISM?
Today I would say Prism and MEF complement each other, just as Prism and Unity do. Prism introduces a set of specific services like RegionManager, DelegateCommand, and EventAggregator which aid in building composite apps. MEF, on the other hand, is a more general composition mechanism for extensibility of applications and frameworks, whether they are composites or not. The key distinguishing feature of MEF is its discoverability, which means that it can go out and discover all the available parts dynamically.
You might be interested in checking out the MEF contrib project (mefcontrib.codeplex.com), which contains an integration layer for Unity and MEF. With that extension, Unity manages MEF behind the scenes, so you are not contending with two containers. The advantage is that it allows you to use Unity for general POCOs and MEF for discovery of extensions. Thus, as Prism is currently built on Unity, you can use it to leverage MEF. To use the contrib project, you'll have to make some slight changes to your Unity bootstrapper, but it should be fairly trivial.
There is definitely some overlap. The place where it's most prominent is with regard to modules. Prism uses an IModule as a means of discovery. In MEF, any component can be a part and can be dynamically discovered. This means with MEF you have modularity from top to bottom, whereas with Prism, modules are more granular units. Composite applications are definitely an area we are concerned with on the MEF team. Over time it is quite likely you will see more and more support for building those types of apps within MEF itself. We're working with p&p to ensure that as that happens, there is a smooth transition.
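As a rough illustration of that attribute-driven discovery, here is a minimal sketch assuming the System.ComponentModel.Composition API that shipped with .NET 4 (the ILogger and ConsoleLogger names are invented):

```csharp
using System;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.Reflection;

public interface ILogger
{
    void Log(string message);
}

// Any class becomes a discoverable part simply by carrying an [Export] attribute;
// no central registration code is required.
[Export(typeof(ILogger))]
public class ConsoleLogger : ILogger
{
    public void Log(string message)
    {
        Console.WriteLine(message);
    }
}

public class Host
{
    // MEF fills this collection with every ILogger it finds in the catalogs.
    [ImportMany]
    public ILogger[] Loggers { get; set; }

    public static void Main()
    {
        // Catalogs describe where parts come from: here the current assembly,
        // but a DirectoryCatalog could scan a plugin folder at run time instead.
        var catalog = new AssemblyCatalog(Assembly.GetExecutingAssembly());
        using (var container = new CompositionContainer(catalog))
        {
            var host = new Host();
            container.ComposeParts(host); // satisfies the [ImportMany] above

            foreach (var logger in host.Loggers)
            {
                logger.Log("Discovered without any explicit registration.");
            }
        }
    }
}
```

Swapping the AssemblyCatalog for a DirectoryCatalog pointed at a plugin folder is what gives MEF its plugin-style extensibility.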
Edit: Do not read this answer. It is embarrassingly wrong. I am fail. Read Glenn Block's above.
It's not obvious, but this is the same question:
Managed Extensibility Framework (MEF) vs. Composite UI Application Block (CAB)
Consensus in the duplicate post is that MEF and Prism provide the same basic set of functionality in different ways, except that Prism provides the Event Aggregator, which is a pub-sub means of communication between application components. You can use this with MEF, however. It's pretty much up to preference, really.
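For reference, this is roughly what that pub-sub communication looks like with Prism's event aggregator (a minimal sketch; OrderPlacedEvent and the two module classes are invented, and it assumes the PubSubEvent<T> API of later Prism versions, which older releases call CompositePresentationEvent<T>):

```csharp
using Prism.Events; // later Prism versions; older ones ship it under Microsoft.Practices.Prism

// The event type and payload are invented for illustration.
public class OrderPlacedEvent : PubSubEvent<string> { }

public class OrderModule
{
    private readonly IEventAggregator _events;

    public OrderModule(IEventAggregator events)
    {
        _events = events;
    }

    public void PlaceOrder(string orderId)
    {
        // Publisher side: no direct reference to any subscriber.
        _events.GetEvent<OrderPlacedEvent>().Publish(orderId);
    }
}

public class AuditModule
{
    public AuditModule(IEventAggregator events)
    {
        // Subscriber side: reacts to the event without knowing who raised it.
        events.GetEvent<OrderPlacedEvent>().Subscribe(OnOrderPlaced);
    }

    private void OnOrderPlaced(string orderId)
    {
        System.Console.WriteLine("Audited order " + orderId);
    }
}
```

Both modules only share the event type and the IEventAggregator instance (typically resolved from the container), so neither needs a reference to the other.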
Take a look at this Sparkling Client podcast on MEF and Prism.
MEF will never replace Prism.
MEF is a dependency injection manager; it's not a dependency injection container.
MEF provides the ability to declare exports and imports declaratively using attributes.
Prism with MEF gives you the ability to auto-discover DLLs and to add and remove plugins by adding or deleting DLLs.
The Prism framework, on the other hand, gives you the event aggregator, region manager and service locator.
You can use Prism without MEF; there are various other options like Ninject, Unity and other DI containers.
You can use MEF with Prism for building plugin-based, extensible applications.