Package diagram for an MVC-patterned project?

We are required to make a package diagram for our senior project. Since our project uses the MVC design pattern, we created an MVC class diagram. Now our problem is in creating the package diagram from our class diagram.
Is it possible to have packages with MVC at the same time? So it would be something like this:
Package: Account
Account Model
Profile Controller
Registration Controller
Profile View
Registration View
These are the controllers that cannot exist without the Account model, so I included them.
Thanks in advance!

Usually it goes the other way around: the class diagram comes after the package diagram.
Are you sure you need a package diagram? What you show are not packages but components. Packages are things that hold together only syntactically, while components are things that really belong together (not necessarily syntactically).
If so, create packages according to the functionalities, as you have named them.
As a second stage, create three subpackages for every package, according to the layers of the MVC model (sometimes there will be more or fewer than three).
Make two diagrams:
One of the large packages, where you can also show their information exchange at a conceptual level.
And one of the small packages, where you put their names in the form parentPackageName.thisPackageName. Here you can show the visibility levels of different information. Very probably, at this level you'll need to divide the huge common diagram into smaller, understandable ones.
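For instance, with the Account and Registration functionalities from the question, the small-package names could look something like this (illustrative only):

Account
  Account.Model
  Account.View
  Account.Controller
Registration
  Registration.Model
  Registration.View
  Registration.Controller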

Related

How should I architect multiple instances of a View, all with very similar ViewModels using Prism and Unity

I'm using Xamarin.Forms, and Prism for MVVM with Unity as my IoC container.
I'm refactoring a tonne of duplicate View and ViewModel code that a previous employee worked on. What they've done is essentially cut and paste a massively complex View and ViewModel. I needed to change some things in these, and had to do it 5 times because of this spaghetti code. This is basically what the view looks like:
The 5 instances of these CardViews and their different ViewModels have a main title, 3 labels, and 3 values that go with the labels. The values are retrieved using various RESTful calls (all using various parameters to retrieve them and processing the results of the calls too).
My question is, what patterns should I be using to simplify these 5 Views and ViewModels, preferably into only 1 file that I need to change? Roughly speaking, I think I should have a ViewModel that offers all of the text, values and functions that I require to get this remote data. But I'm scratching my head at all the different approaches that it seems are available to me (ViewModel interfaces, dependency injection, methods of registering the template view with different instances of the same ViewModel, etc).
What complicates my problem in particular is that these 5 different views are split across Prism Modules (i.e. .NET projects). I understand the need for these but they seem to just add to the problem of duplicate code that largely does the same thing. Maybe these should contain the logic for the restful calls and processing? But how would that fit in with my ViewModel/View association problem from above?
I will continue researching the best way to do this, but I just wanted to know if anyone can steer me into the best practice direction?
If you have five views that look the same and five view models that do (nearly) the same thing, drop four of each. To account for the differences in what the view models do, create services.
The BookService might implement GetTitles by querying a REST service for book titles, while the DvdService queries another REST service for DVD titles. Both of them implement ITitleService. Then the UnifiedViewModel is specialized for books by passing it a BookService (preferably through constructor injection), and thus it does exactly what the BookViewModel did before. Now you have one view model and five services where before you had five view models.
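A minimal sketch of that idea (the return type and the fake data are placeholders; the real services would make the REST calls):

using System.Collections.Generic;

public interface ITitleService
{
    IReadOnlyList<string> GetTitles();
}

public class BookService : ITitleService
{
    public IReadOnlyList<string> GetTitles()
    {
        // The real implementation would query the book REST service here.
        return new[] { "Book A", "Book B" };
    }
}

public class DvdService : ITitleService
{
    public IReadOnlyList<string> GetTitles()
    {
        // The real implementation would query the DVD REST service here.
        return new[] { "DVD A", "DVD B" };
    }
}

public class UnifiedViewModel
{
    private readonly ITitleService titleService;

    // The concrete service is supplied through constructor injection,
    // so this one view model class replaces the five near-duplicates.
    public UnifiedViewModel(ITitleService titleService)
    {
        this.titleService = titleService;
    }

    public IReadOnlyList<string> Titles
    {
        get { return titleService.GetTitles(); }
    }
}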

How should I extract my existing code into a Laravel Plugin

I have made a tournament system in Laravel 5.3.
Now I want to extract the core (generating trees) into a plugin and make it open source.
The idea is to build the plugin first, then replace the code in my app with the plugin's code and update all the references.
I'm making a demo so people can migrate and seed the necessary objects to generate their own tournament tree.
My main concern is that my system has a lot of things that should not belong to the plugin, but they are still in the same tables.
In my models, I removed a lot of fields/functions that are not necessary in the plugin.
For instance, in my tournament model I have a function that handles permissions, because not all users can CRUD all tournaments. This function has no place in my plugin, as I will not include any policies (this should be up to each use case).
Another example is printing the tree. In my system I allow users to print the tree, but my plugin is meant to contain the core functions, not optional features.
Also, I think I should use only one model; that is, I should delete my project model and use my plugin model, as they will represent the same data. So what happens if I remove a field/method as mentioned previously?
For my models, one solution would be to create child objects that just extend my plugin's models, but that would mean changing all my current model names. Is that a good approach?
How should I manage migrations? Should I include unused fields in my plugin?
Also, which User model should I use, Xoco\my-plugin\User or App\User?

HMVC how to separate the modules?

I'm making a leave management (HRM) website. I'm using CodeIgniter HMVC to build it. The following features are included in this site:
A table to display a summary of leaves.
A table for leave types like annual, MC, urgent, other...
I was thinking to create two modules for leave_summary and leave_types, but my friend told me it is useless.
According to HMVC architecture, we are trying to create self-contained modules for reusability. If I'm creating a separate module for leave types, I should be able to reuse it, and the module itself needs to be self-contained. But I can't use the leave_types module anywhere else.
My friend asked me to put all the leave-related stuff in one module called leave. This sounds strange to me, as I have found lots of examples where people try to separate things out.
Do we only need to separate the modules which can be reused in the future (e.g. a login module, image_gallery module, profile module) and keep everything else inside one module?
(According to the above example, I would have to keep everything related to leave in one module,
e.g. leave_type, leave_requests, and leave_summary would all be placed inside the leave module.)
What benefits will I get if I separate leave_type, leave_requests, leave_summary, etc. into separate modules?
Will I be able to reuse them? If so, how?
In HMVC, model classes and other assets can be exchanged among modules, so how can I call a module self-contained or a separate entity when it depends on another module?
(ex: I have to call leave_type module's model class inside the leave_summary module to show the leave type name in a table.)
I'm a little lost here. Please help me understand. Thanks a lot!
I work on a lot of MVC projects, and I agree with your friend.
This question often arises when writing joins: you have to choose which module's model the query should live in, and if you write it in one model, the next developer may write it in another.
So in my opinion, it is best to keep tables of the same type, which handle the same relations and behaviour, together in one module: a leave module, a profile module, and so on.

Business Layer structure, how do you build yours?

I am a big fan of NTiers for my development choices; of course it doesn't fit every scenario.
I am currently working on a new project, and I am trying to have a play with the way I normally work to see if I can clean it up, as I have been a very bad boy and have been putting too much code in the presentation layer.
My normal business layer structure is this (basic view of it):
Business
Services
FooComponent
FooHelpers
FooWorkflows
BahComponent
BahHelpers
BahWorkflows
Utilities
Common
ExceptionHandlers
Importers
etc...
Now with the above I have great access to directly save a Foo object and a Bah object, via their respective helpers.
The XXXHelpers give me access to Save, Edit and Load the respective objects, but where do I put the logic to save objects with child objects?
For example:
We have the objects below (not very good objects, I know):
Employee
EmployeeDetails
EmployeeMembership
EmployeeProfile
Currently I would build these all up in the presentation layer and then pass them to their Helpers. I feel this is wrong; I think the data should be passed to a single point above the presentation layer, somewhere in the business layer, and sorted out there.
But I'm at a bit of a loss as to where I would put this logic and what to call that part. Would it go under Utilities as an EmployeeManager or something like that?
What would you do? I know this is all preference.
A more detailed layout
The workflows contain all the calls directly to the DataRepository for example:
public ObjectName GetById(Guid id)
{
    return DataRepository.ObjectNameProvider.GetById(id);
}
And then the helpers provide access to the workflows:
public ObjectName GetById(Guid id)
{
    return loadWorkflow.GetById(id);
}
This is to cut down on duplicate code: you can have one call in the workflow to getBySomeProperty and then several calls in the Helper which do other operations and return the data in different ways (a bad example would be GetByIdAsc and GetByIdDesc).
By separating the calls to the data model through the DataRepository, the thinking was that it would be possible to swap the model out for another implementation; however, ProviderHelper has not been broken down, so it is not interchangeable, as it is unfortunately hard-coded to EF.
I don't intend to change the access technology, but in the future there might be something better, or just something that all the cool kids are now using, that I might want to implement instead.
projectName.Core
projectName.Business
- Interfaces
- IDeleteWorkflows.cs
- ILoadWorkflows.cs
- ISaveWorkflows.cs
- IServiceHelper.cs
- IServiceViewHelper.cs
- Services
- ObjectNameComponent
- Helpers
- ObjectNameHelper.cs
- Workflows
- DeleteObjectNameWorkflow.cs
- LoadObjectNameWorkflow.cs
- SaveObjectNameWorkflow.cs
- Utilities
- Common
- SettingsManager.cs
- JavascriptManager.cs
- XmlHelper.cs
- others...
- ExceptionHandlers
- ExceptionManager.cs
- ExceptionManagerFactory.cs
- ExceptionNotifier.cs
projectName.Data
- Bases
- ObjectNameProviderBase.cs
- Helpers
- ProviderHelper.cs
- Interfaces
- IProviderBase.cs
- DataRepository.cs
projectName.Data.Model
- Database.edmx
projectName.Entities (Entities that represent the DB tables are created by EF in .Data.Model, this is for others that I may need that are not related to the database)
- Helpers
- EnumHelper.cs
projectName.Presentation
(depends on what the application calls for)
projectName.web
projectName.mvc
projectName.admin
The test Projects
projectName.Business.Tests
projectName.Data.Test
+1 for an interesting question.
So, the problem you describe is pretty common - I'd take a different approach - first with the logical tiers and secondly with the utility and helper namespaces, which I'd try and factor out completely - I'll tell you why in a second.
But first, my preferred approach here is pretty common enterprise architecture, which I'll try to highlight in brief, but there's much more depth out there. It does require some radical changes in thinking - using NHibernate or Entity Framework to allow you to query your object model directly and let the ORM deal with things like mapping to and from the database, lazy loading relationships, etc. Doing this will allow you to implement all of your business logic within a domain model.
First the tiers (or projects in your solution);
YourApplication.Domain
The domain model - the objects representing your problem space. These are plain old CLR objects with all of your key business logic. This is where your example objects would live, and their relationships would be represented as collections. There is nothing in this layer that deals with persistence etc, it's just objects.
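For example, a couple of the objects from the question might end up looking like this (the properties and the business rule are made-up placeholders):

using System;
using System.Collections.Generic;
using System.Linq;

public class Employee
{
    public Guid Id { get; set; }
    public string Name { get; set; }

    // Relationships are plain references and collections; the ORM handles
    // mapping and lazy loading them.
    public EmployeeProfile Profile { get; set; }
    public ICollection<EmployeeMembership> Memberships { get; set; } = new List<EmployeeMembership>();

    // Key business logic lives on the domain object itself.
    public bool HasCurrentMembership()
    {
        return Memberships.Any(m => m.IsCurrent);
    }
}

public class EmployeeProfile
{
    public string DisplayName { get; set; }
}

public class EmployeeMembership
{
    public bool IsCurrent { get; set; }
}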
YourApplication.Data
Repository classes - these are classes that deal with getting the aggregate root(s) of your domain model.
For instance, it's unlikely in your sample classes that you would want to look at EmployeeDetails without also looking at Employee (an assumption I know, but you get the gist - invoice lines is a better example; you generally get to invoice lines via an invoice rather than loading them independently). As such, the repository classes, of which you have one per aggregate root, will be responsible for getting the initial entities out of the database using the ORM in question, implementing any query strategies (like paging or sorting) and returning the aggregate root to the consumer. The repository would consume the current active data context (ISession in NHibernate) - how this session is created depends on what type of app you are building.
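A sketch of an employee repository along those lines, assuming NHibernate and the Employee class sketched above (IEmployeeRepository and the method names are illustrative):

using System;
using System.Collections.Generic;
using System.Linq;
using NHibernate;
using NHibernate.Linq;

public interface IEmployeeRepository
{
    Employee GetById(Guid id);
    IReadOnlyList<Employee> GetPage(int pageIndex, int pageSize);
    void Add(Employee employee);
}

public class EmployeeRepository : IEmployeeRepository
{
    private readonly ISession session; // the current data context, injected per request/form

    public EmployeeRepository(ISession session)
    {
        this.session = session;
    }

    public Employee GetById(Guid id)
    {
        return session.Get<Employee>(id);
    }

    public IReadOnlyList<Employee> GetPage(int pageIndex, int pageSize)
    {
        // Query strategies such as paging and sorting belong here,
        // not in the presentation tier.
        return session.Query<Employee>()
                      .OrderBy(e => e.Name)
                      .Skip(pageIndex * pageSize)
                      .Take(pageSize)
                      .ToList();
    }

    public void Add(Employee employee)
    {
        session.Save(employee);
    }
}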
YourApplication.Workflow
Could also be called YourApplication.Services, but this can be confused with web services
This tier is all about interrelated, complex atomic operations - rather than have a bunch of things to be called in your presentation tier, and therefore increase coupling, you can wrap such operations into workflows or services.
It's possible you could do without this in many applications.
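As a rough illustration, such a workflow/service might look like this (the names are made up, and it builds on the repository sketched above):

using System;

public class EmployeeRegistrationService
{
    private readonly IEmployeeRepository employees;

    public EmployeeRegistrationService(IEmployeeRepository employees)
    {
        this.employees = employees;
    }

    // One interrelated, atomic operation exposed to the presentation tier,
    // instead of several repository calls the UI would otherwise coordinate.
    public Employee Register(string name, string displayName)
    {
        var employee = new Employee
        {
            Id = Guid.NewGuid(),
            Name = name,
            Profile = new EmployeeProfile { DisplayName = displayName }
        };

        employees.Add(employee);
        return employee;
    }
}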
Other tiers then depend on your architecture and the application you're implementing.
YourApplication.YourChosenPresentationTier
If you're using web services to distribute your tiers, then you would create DTO contracts that represent just the data you are exposing between the domain and the consumers. You would define assemblers that would know how to move data in and out of these contracts from the domain (you would never send domain objects over the wire!)
In this situation, and you're also creating the client, you would consume the operation and data contracts defined above in your presentation tier, probably binding to the DTOs directly as each DTO should be view specific.
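As a sketch, a DTO and its assembler could be as simple as this (again, illustrative names only):

using System;

public class EmployeeSummaryDto
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public string DisplayName { get; set; }
}

public static class EmployeeAssembler
{
    // Only flat DTOs cross the service boundary; domain objects never do.
    public static EmployeeSummaryDto ToDto(Employee employee)
    {
        return new EmployeeSummaryDto
        {
            Id = employee.Id,
            Name = employee.Name,
            DisplayName = employee.Profile == null ? null : employee.Profile.DisplayName
        };
    }
}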
If you have no need to distribute your tiers, remembering the first rule of distributed architectures is don't distribute, then you would consume the workflow/services and repositories directly from within asp.net, mvc, wpf, winforms etc.
That just leaves where the data contexts are established. In a web application, each request is usually pretty self contained, so a request scoped context is best. That means that the context and connection is established at the start of the request and disposed at the end. It's trivial to get your chosen IoC/dependency injection framework to configure per-request components for you.
In a desktop app, WPF or winforms, you would have a context per form. This ensures that edits to domain entities in an edit dialog that update the model but don't make it to the database (eg: Cancel was selected) don't interfere with other contexts or worse end up being accidentally persisted.
Dependency injection
All of the above would be defined as interfaces first, with concrete implementations realised through an IoC and dependency injection framework (my preference is castle windsor). This allows you to isolate, mock and unit test individual tiers independently and in a large application, dependency injection is a life saver!
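A small Castle Windsor registration sketch tying this together (the component names come from the sketches above; the per-web-request session lifestyle assumes Windsor's ASP.NET lifestyle module is configured, and the ISessionFactory registration is omitted):

using Castle.MicroKernel.Registration;
using Castle.Windsor;
using NHibernate;

public static class ContainerBootstrapper
{
    public static IWindsorContainer Build()
    {
        var container = new WindsorContainer();

        container.Register(
            // One NHibernate session per web request, opened from the session factory.
            Component.For<ISession>()
                     .UsingFactoryMethod(k => k.Resolve<ISessionFactory>().OpenSession())
                     .LifestylePerWebRequest(),

            // Tiers are registered against interfaces so each can be mocked and tested in isolation.
            Component.For<IEmployeeRepository>()
                     .ImplementedBy<EmployeeRepository>()
                     .LifestylePerWebRequest(),

            Component.For<EmployeeRegistrationService>()
                     .LifestyleTransient());

        return container;
    }
}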
Those namespaces
Finally, the reason I'd lose the helpers namespace is, in the model above, you don't need them, but also, like utility namespaces they give lazy developers an excuse not to think about where a piece of code logically sits. MyApp.Helpers.* and MyApp.Utility.* just means that if I have some code, say an exception handler that maybe logically belongs within MyApp.Data.Repositories.Customers (maybe it's a customer ref is not unique exception), a lazy developer can just place it in MyApp.Utility.CustomerRefNotUniqueException without really having to think.
If you have common framework type code that you need to wrap up, add a MyApp.Framework project and relevant namespaces. If you're adding a new model binder, put it in MyApp.Framework.Mvc; if it's common logging functionality, put it in MyApp.Framework.Logging, and so on. In most cases, there shouldn't be any need to introduce a utility or helpers namespace.
Wrap up
So that scratches the surface - hope it's of some help. This is how I'm developing software today, and I've intentionally tried to be brief - if I can elaborate on any specifics, let me know. The final thing to say on this opinionated piece is the above is for reasonably large scale development - if you're writing notepad version 2 or a corporate phone book, the above is probably total overkill!!!
Cheers
Tony
There is a nice diagram and a description of the application layout on this page, although it looks like further down the article the application isn't split into physical layers (separate projects) - Entity Framework POCO Repository

Visual Studio code generation - how to deal with developers editing class files

So thanks to the Visualization and Modeling Feature Pack, I can build a UML model diagram and generate a bunch of classes.
But what now? Presumably, my developers will add code to those classes. Useful code, valuable code, and as the templates themselves indicate:
// Changes to this file will be lost if the code is regenerated.
So what is the best solution here? Can I make the modeling project reflect changes to the actual classes? Should I generate partial classes? Modify the default templates to read class files and not auto-generate anything that has been modified? Should I tell developers not to edit model files under pain of....well, pain?
Thanks for the tips.
As far as I know, this is really the key reason for partial classes in the first place. The custom code goes in one file, the auto-generated in another.
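For example, something like this (a minimal sketch; the class name is just illustrative):

// Customer.generated.cs - regenerated from the model, never edited by hand.
public partial class Customer
{
    public string Name { get; set; }
}

// Customer.cs - hand-written members live here and survive regeneration.
public partial class Customer
{
    public string DisplayName()
    {
        return "Customer: " + Name;
    }
}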
You could also create classes derived from the generated ones and put any changes in there. I also agree with the above poster that partial classes could be the way to go.
Although the tools generate basic skeleton classes out of the box, that's really just a starting point. You can easily adapt the generator templates to create your own stuff. Different people want to generate different code from the classes - some even generate XML or SQL. And yep, in C#, partial classes are good to generate, so as to keep the hand-written code separate from the generated bits.
It's good to put lots of extension points in the generated code, where you fill in the details by hand code.
Another neat idea is "double derived": from each UML class, generate a base class and a derived class. The derived one has only constructors. The base class has any methods you generate. So your hand code can easily override generated methods where you need that.
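Roughly, the double-derived pattern looks like this (illustrative names):

// Fully generated; regenerated whenever the UML model changes.
public abstract class InvoiceBase
{
    public virtual decimal CalculateTotal()
    {
        return 0m; // generated default behaviour
    }
}

// Generated once with only constructors; safe to hand-edit afterwards,
// for example to override the generated methods.
public class Invoice : InvoiceBase
{
    public Invoice()
    {
    }

    public override decimal CalculateTotal()
    {
        // hand-written override of the generated default
        return base.CalculateTotal() + 10m;
    }
}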
There are several options in the tool, and recommending what is best is hard without knowing your scenario. Partial classes are great for some, but not all, applications. If you want your UML class to generate a partial class, you can set its C# stereotype property to "Partial" and it will do so; custom code can then be added in a partial class that won't be overwritten. If you want to prevent code from being overwritten, you can do this by setting the overwrite property to False on the template binding that corresponds to the package you are working on. This lets your extension code live in a package that is not overwritten, while your model-mastered code is overwritten with the latest model changes. Finally, if you want your code to be the master for your model so it always reflects the latest code, then you can reverse engineer your code by using the Architecture Explorer to select your classes and then dragging them into a UML diagram. So for a given gesture, either the model is the master or the code is the master. In this version, we did not implement automated merge capabilities between the two.
