Two Autofac containers competing in a single web request - asp.net-mvc-3

A bit of a strange situation: I'm developing an ASP.NET MVC 3 application that uses Autofac as its IoC container and that can be dropped into any existing MVC or WebForms application.
Everything works fine, except when the other application also uses Autofac. I've created a custom DependencyResolver wrapper that combines my application's AutofacDependencyResolver with the DependencyResolver.Current of the other application (when set). Resolving components sometimes fails, even though I can see in the debugger that everything is properly registered.
I suspect that both inner resolvers are competing for a spot in the HttpContext.Items collection, but I can't put my finger on the exact issue.
What would be the proper way to handle this situation?
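For reference, the kind of combining wrapper described above might look roughly like the sketch below. The class and field names are illustrative (not taken from the actual code); it implements System.Web.Mvc.IDependencyResolver, asks the plug-in's Autofac resolver first, and falls back to the host's resolver.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc;

// Illustrative composite resolver: try the plug-in's AutofacDependencyResolver first,
// then fall back to whatever resolver the host application had registered.
public class CompositeDependencyResolver : IDependencyResolver
{
    private readonly IDependencyResolver _primary;   // e.g. the plug-in's AutofacDependencyResolver
    private readonly IDependencyResolver _fallback;  // e.g. the host's original DependencyResolver.Current

    public CompositeDependencyResolver(IDependencyResolver primary, IDependencyResolver fallback)
    {
        _primary = primary;
        _fallback = fallback;
    }

    public object GetService(Type serviceType)
    {
        // Return the first resolver that can supply the service.
        return _primary.GetService(serviceType) ?? _fallback.GetService(serviceType);
    }

    public IEnumerable<object> GetServices(Type serviceType)
    {
        var services = _primary.GetServices(serviceType).ToList();
        return services.Any() ? services : _fallback.GetServices(serviceType);
    }
}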

You could isolate the configuration into a module. When you integrate one application into the other, register the module.
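A minimal sketch of that idea, assuming illustrative type names (IOrderService/OrderService are placeholders, not from the original question): the plug-in bundles its registrations into an Autofac Module, and the host registers that module into its own single container instead of running a second one.

using Autofac;

// Placeholder service types for the sake of the example.
public interface IOrderService { }
public class OrderService : IOrderService { }

// Illustrative module that bundles the plug-in's registrations.
public class MyPluginModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        builder.RegisterType<OrderService>().As<IOrderService>().InstancePerLifetimeScope();
        // ...the rest of the plug-in's registrations...
    }
}

// In the host application's single container setup:
// var builder = new ContainerBuilder();
// builder.RegisterModule(new MyPluginModule());
// var container = builder.Build();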

Related

Is the MVC framework sometimes used as only the user interface layer when using dependency injection?

I'm trying to learn dependency injection. The book/example I am following seems to be using an MVC project as just the UI layer within a broader architecture. The example includes a separate project for the domain layer and yet another project for the data access layer.
When I first learned MVC, I came away thinking MVC was the entire architecture: the V (view) as the UI layer, the C (controller) as the domain layer, and the M (model) as the data access layer.
So is using an MVC project as only the UI layer a proper and/or commonly accepted application of the MVC framework?
So is using an MVC project as only the UI layer a proper and/or commonly accepted application of the MVC framework?
Yes.
While it is possible to build an application entirely within the context of ASP.NET MVC, doing so means the application would have to be rewritten from scratch to use a different UI framework. Isolating the business logic into a separate set of services that are not coupled to ASP.NET MVC means that only the top layer needs to be replaced to move to a different UI framework. It also means the application's lifecycle may extend beyond the end of ASP.NET MVC, and/or it can be turned into an application with a different UI framework (WebApi, WPF, etc.) without too much trouble.
The purpose of dependency injection is to decouple your services from all other parts of the application, including each other. So by extension, it is only natural to build the business layer separately from the UI layer. Whether you physically have them in one assembly or multiple is really just a matter of preference.
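To make that concrete, here is a minimal sketch (type names are illustrative): the business interface lives in a separate project with no ASP.NET MVC reference, and the controller in the UI project depends only on that abstraction, which the container injects.

using System.Collections.Generic;
using System.Web.Mvc;

// Business layer (separate project, no ASP.NET MVC reference) - illustrative names.
public class Product { public string Name { get; set; } }

public interface IProductService
{
    IEnumerable<Product> GetFeaturedProducts();
}

// UI layer (the MVC project) depends only on the abstraction; the container injects it.
public class ProductsController : Controller
{
    private readonly IProductService _productService;

    public ProductsController(IProductService productService)
    {
        _productService = productService;
    }

    public ActionResult Index()
    {
        return View(_productService.GetFeaturedProducts());
    }
}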
Applying SRP to the MVC design pattern will lead you there. The same goes for MVVM: you are extracting logic from the Model into other classes like Interactors, Services, Repositories, etc.
From any point of view this is perfectly normal (and desirable). Your Model is just an abstraction over several different layers.
I would suggest you take a look at VIPER (not the car) - https://www.objc.io/issues/13-architecture/viper/ - and you will see something similar to what is occurring to you right now.

ASP.Net Core MVC/WebApi - talking to each other over HTTP?

I have a theoretical (and probably foolish) question about Web API and an MVC web app in .NET Core. Seeing as they overlap in Core, it's even easier to have an API and a visual app in the same project, responding to different routes - great!
But there's something architecturally fundamental which I always struggle with: let's say I have a Web API which other services use to do some data operations and some custom actions, etc. Then I also have a big MVC app; this is the central app (let's say, for argument's sake). Here's the question.
1) Should the MVC app be making Web API calls over HTTP (even though they will likely run on the same box - they don't have to, of course)?
2) Should the MVC app be sharing the service and repository classes with the Web API?
3) Should the MVC app be programmatically calling the Web API from MVC (i.e. injecting the Web API classes into MVC controllers - thereby having a dependency on deployment, but abstracting the API logic and avoiding HTTP)?
Please don't let me know if this is a stupid question...
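For what it's worth, option 2 might look roughly like the sketch below in ASP.NET Core (names are illustrative, not from the question): the same service class is registered once and injected into both the API controller and the MVC controller, so no HTTP hop is needed when they share a process.

using Microsoft.AspNetCore.Mvc;

// Illustrative shared service - registered once in Startup.ConfigureServices, e.g.:
// services.AddScoped<ICustomerService, CustomerService>();
public class Customer { public int Id { get; set; } public string Name { get; set; } }

public interface ICustomerService
{
    Customer Get(int id);
}

// API controller for external callers.
[Route("api/customers")]
public class CustomersApiController : Controller
{
    private readonly ICustomerService _customers;
    public CustomersApiController(ICustomerService customers) { _customers = customers; }

    [HttpGet("{id}")]
    public IActionResult Get(int id) => Ok(_customers.Get(id));
}

// MVC controller uses the same service directly instead of calling the API over HTTP.
public class CustomersController : Controller
{
    private readonly ICustomerService _customers;
    public CustomersController(ICustomerService customers) { _customers = customers; }

    public IActionResult Details(int id) => View(_customers.Get(id));
}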

what's the 'right' way to do MVC for JSPs in Java EE 5?

I've inherited an incomplete but small web project (Java EE 5, running on WebSphere 7).
The project consists mostly of JSPs that are accessed directly via their URL, and most JSPs look up their own reference to the EJBs (services) they need. Also, there's a Servlet for every form that gets submitted by the HTML code in the JSPs.
Architecturally speaking, is there anything wrong with this?
I was thinking it would be better to have an MVC design. I don't want to convert everything to JSF because I don't want to convert all the HTML and embedded Java scriptlets into JSF tags and managed beans.
I don't really want to use Struts or Spring MVC because they're not part of the Java EE 5 toolkit that comes out of the box with WebSphere, and I don't want to add additional complexity with the additional libs and config files.
I was thinking about building my own little MVC with a "ControllerServlet" that accepts a command, dynamically builds and executes the command object, and redirects to the JSP view.
But I ask myself again, is there anything "wrong" with JSPs that post to Servlets? It's actually kind of elegant in its simplicity.
What do you think?
Any suggestions are GREATLY appreciated! Rob
You're asking a rather subjective/localized question. But okay.
There's technically nothing wrong with individual JSPs that submit to individual servlets. The only real problem is when the servlets turn out to contain duplicated code for quite common tasks like collecting request parameters, converting/validating them, setting bean properties, invoking actions, and performing navigation. That is not DRY and is what an MVC framework with a single front controller and a well-defined lifecycle is supposed to solve.
Or, if the servlets' tasks are actually well refactored with homegrown code to perform those common tasks, then this is in turn not very maintainable, as no one other than the original developer knows the ins and outs of this custom framework. It's hard to find anyone else willing to maintain such a webapp without having to learn yet another framework - one the new developer is unlikely to see again in future webapps. That is why companies usually adopt an existing and well-developed MVC framework like JSF, Spring MVC, Stripes, Struts, etc.

Event handling mechanism between ManagedBeans in JSF2?

Is there a way to decouple ManagedBeans from each other in a way that makes it possible to send and receive custom events - probably over the (cool) FacesContext?! I do not want to inject Beans as ManagedProperty, to reduce direct dependencies. Unfortunately @ListenerFor and all that new stuff only works for components and renderers and seems like completely the wrong approach.
Those of you who are familiar with Adobe Flex' event mechanism know what I mean and what I expect from a standardized web UI framework.
Please let me know an elegant way that is included in the JSF specification without the need to implement another framework around.
Is there a way to decouple ManagedBeans from each other in a way that makes it possible to send and receive custom events - probably over the (cool) FacesContext?!
Not without adding the event to a component, and you would have to add it before the Event phase of the JSF lifecycle.
I do not want to inject Beans as ManagedProperty, to reduce direct dependencies
Just because you are not injecting the needed dependencies into your bean doesn't mean that those dependencies wouldn't exist anyway if you went with an event-driven model. At least by injecting the dependencies you explicitly declare what the managed bean depends on. This seems like a much more maintainable solution than what you are proposing.
Those of you who are familiar with Adobe Flex' event mechanism know what I mean and what I expect from a standardized web UI framework.
You expect a desktop-style event-driven model in a web application framework? This is apples to oranges. Adobe Flex is a Rich Internet Application framework that behaves like a desktop application while communicating with outside web services. JSF is a standard web application framework for web-based components powered by JavaScript and Ajax, with reusable server components and a server-side lifecycle that includes an event phase for components.
Please let me know an elegant way that is included in the JSF specification without the need to implement another framework around.
Your language implies that you do not find JSF elegant and that you are actively trying to make it something it is not. Please do not do this; you will create a nightmare for yourself and anybody who has to maintain your application.
JSF requires a different way of thinking about web application development than what you are used to. If you find it that unpalatable, then I suggest abandoning it for a web application framework that fits your comfort level. You mentioned Flex; there is also Silverlight with .NET.

IoC container placement within an enterprise application

I've been looking into IoC containers and AOP recently, and I'm pretty amazed by the concepts. I'm struggling, however, to decide how and where to implement the container.
The articles below suggest implementing the container in the 'application entry point':
Best Practices for IOC Container
IOC across visual studio projects?
Not understanding where to create IoC Containers in system architecture
Now - my thought-experiment application will consist of multiple Visual Studio projects (one for data access, one for the WinForms application). And let's say I want to use AOP for logging with log4net, so I set up log4net in the IoC container.
So the WinForms application is the entry point; that's where the IoC container should go.
Here's the question: if I want to log stuff in my data access project/layer, should I add a reference to my WinForms application, get the IoC container from there, get the log4net instance out of it, and use it for logging?
That would mean my data layer depends on the WinForms application - that can't be right. How about putting the container in something like a 'Common' project within the solution? That way, all related projects (data access, WinForms, etc.) can access the container.
What is the right way to go here?
Your application's Composition Root would be the Windows Forms project. This is the only project which should have a reference to a DI Container.
In all other projects, dependencies should be injected via Constructor Injection. All decent DI Containers understand this pattern and use it to Auto-wire dependencies from the Composition Root.
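A rough sketch of that layout with Autofac (names illustrative, not from the question): classes in the other projects simply declare their dependencies in their constructors, and only the WinForms entry point builds the container and resolves the top-level object.

using Autofac;

// Defined in the data access project - illustrative names.
public interface IOrderRepository { }
public class OrderRepository : IOrderRepository { }

// Defined in another project; it only asks for its dependency via the constructor.
public class OrderProcessor
{
    private readonly IOrderRepository _repository;
    public OrderProcessor(IOrderRepository repository) { _repository = repository; }
}

// Composition Root in the WinForms project - the only place that touches Autofac.
static class Program
{
    static void Main()
    {
        var builder = new ContainerBuilder();
        builder.RegisterType<OrderRepository>().As<IOrderRepository>();
        builder.RegisterType<OrderProcessor>();

        using (var container = builder.Build())
        {
            // Autofac auto-wires IOrderRepository into OrderProcessor's constructor.
            var processor = container.Resolve<OrderProcessor>();
            // ...run the application with the resolved object graph...
        }
    }
}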
I've abstracted my container into a separate assembly that is referenced by all other assemblies/projects that depend on its services. The container project has just a single class and - more or less - a single method:
using Autofac;

public class MySpecialContainer
{
    private IContainer _container; // built in the ctor or an Initialize() method

    public T Resolve<T>() { return _container.Resolve<T>(); } // delegate to the wrapped container
}
The container would be built either in MySpecialContainer's ctor, or you could just add another method like Initialize() or some such.
The only problem is that this approach broke down for me when I used Autofac and had both a Windows Service and an ASP.NET project needing the container. Each had its own requirement for scoped-lifetime services: Windows Service - PerLifetimeScope, ASP.NET - PerHttpRequest. I guess I could have passed an argument into MySpecialContainer denoting which scenario to configure for, but I decided to just take on an Autofac dependency directly.
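For illustration, that scope difference could be expressed roughly like this with Autofac and its MVC integration (IUnitOfWork/UnitOfWork are placeholder names; InstancePerHttpRequest comes from the older Autofac.Integration.Mvc package):

using Autofac;
using Autofac.Integration.Mvc;

// Placeholder types for the sake of the example.
public interface IUnitOfWork { }
public class UnitOfWork : IUnitOfWork { }

public static class ContainerConfig
{
    // One registration, two lifetime policies depending on the host.
    public static IContainer Build(bool isWebHost)
    {
        var builder = new ContainerBuilder();

        var registration = builder.RegisterType<UnitOfWork>().As<IUnitOfWork>();
        if (isWebHost)
            registration.InstancePerHttpRequest();   // ASP.NET: one instance per HTTP request
        else
            registration.InstancePerLifetimeScope(); // Windows Service: one per owned lifetime scope

        return builder.Build();
    }
}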
The good news is, if you stick to ctor injection, then you can very easily swap out various container implementations - Autofac, Ninject, StructureMap, etc.
