Using Alloy Models

I'm working on a project about the live upgrade of HA applications in SA Forum middleware.
As part of my research, I need to create a UML profile for my input upgrade campaign file and validate that file against some dependency constraints. Now I want to use Alloy instead of UML in my work, especially since it's more abstract and formal than UML (of course, UML + OCL would also be formal). So my question is: if UML + OCL is formal, what is the benefit of using Alloy?
In general, what are the benefits of Alloy over UML?

As far as I know, there are no tools that let you check your OCL constraints against the UML model and generate and visualize valid instances, so if you are planning to do formal analysis of your models and specifications, Alloy might be a better choice. Even if you're not planning to do much analysis, Alloy's ability to generate and visualize valid instances is a great help in making sure you got your model and specification right.
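As a sketch of what that buys you, here is a minimal Alloy model of an upgrade campaign's step dependencies (the signature and constraint names are invented for illustration, not taken from the SA Forum specification). The Analyzer can both generate and visualize valid instances of it and check asserted properties within a bounded scope:

```alloy
// Hypothetical model of upgrade steps and their dependencies
sig Step {
    dependsOn: set Step
}

// Dependency constraint: no step may depend on itself,
// directly or transitively (^ is transitive closure)
fact AcyclicDependencies {
    no s: Step | s in s.^dependsOn
}

// Ask the Alloy Analyzer to generate and visualize valid instances
run {} for 5 Step

// Check a claimed property within a bounded scope
assert NoSelfDependency {
    no s: Step | s in s.dependsOn
}
check NoSelfDependency for 5
```

Running `check` exhaustively searches all instances up to the scope bound, which is exactly the kind of push-button analysis that UML + OCL tooling generally doesn't give you.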

Related

Model Domain separation and Automapper in F#

I'm new to F# but would like to start building an API with it.
I have previously built C# APIs. In those, we layer the architecture between the models (in the controllers), the domain DTOs in the services, and finally the data layer if there needs to be any persistence.
Between these layers we use AutoMapper to map the types up and down the layers, keeping a clean separation; for example, none of our domain objects can have anything null in them (we use a maybe/option type), and we cater for this in our mappings between the objects.
I have seen examples of building APIs with F#, and lots of getting-started material, but no good examples of architecture. I have also seen people try to use AutoMapper with F#, but it seems shoehorned in, and you don't see anyone use it in any of the example tutorials on the net, at least not that I have found.
I'm just wondering what people do. Do you use something like AutoMapper in F#? Do you keep a separation between your models and domain DTOs? Do you care? Am I thinking of this in too much of an OOP way?
The closest I have come to any help is https://fsharpforfunandprofit.com/posts/recipe-part3/
It's not that common, but we prefer to keep everything manual; we don't create DTOs.
If we want to transfer objects, we serialize and deserialize them manually, without creating additional objects.
DTOs are often used as an aid to serialization, but then you have to do just as much work to write the object <-> DTO mapping, and you end up with slow, reflection-based, unoptimized serialization.
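A minimal F# sketch of this manual approach, assuming a hypothetical Order type (the names are illustrative, not from the poster's codebase):

```fsharp
// Hypothetical domain type: the option makes "no note" explicit; no nulls.
type Order = { Id: int; Note: string option }

// Manual serialization: no DTO, no AutoMapper; the option is handled
// explicitly at the boundary instead of in a mapping profile.
let toJson (order: Order) =
    let note =
        match order.Note with
        | Some n -> sprintf "\"%s\"" n
        | None -> "null"
    sprintf "{ \"id\": %d, \"note\": %s }" order.Id note

// toJson { Id = 1; Note = None } evaluates to { "id": 1, "note": null }
```

The point is that the domain type stays clean and the null/option handling lives in one small, explicit function rather than in a reflection-driven mapping configuration.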

Show events and GUI classes in a class diagram

I am currently working on a project and need to create some UML diagrams properly in order to put them in my final report.
I'm wondering whether I should put the GUI classes in the diagram. If I do, there would be a lot of information, and I fear this would make the diagram harder to read!
In addition, as I'm using observers in my program, how can I add the Event classes that I used? I found a similar question here:
Typical uml diagram for showing events
but I don't know whether that is the right way or not!
Thank you for your answers
Class diagrams are typically used to show the relationships between classes, along with cardinalities; they are a bit like ER diagrams. In your case, if you want to show the control flow between different classes, with events in time-ordered fashion, it is better to use a sequence diagram; if you want to show the relationships between classes, with one object participating in another, use a class diagram. Both diagrams will certainly add some rigor to your final report.
I used ArgoUML (free and open-source software) to create those diagrams, but various other tools, such as Rational Rose, Violet, or Visual Studio 2010, are also available.

Is it possible to use the Presentation Model Pattern for a GEF-based RCP app with an EMF domain model?

I'm working on an Eclipse RCP application, using a third-party domain model based on EMF, and a GEF editor for editing.
GEF uses the MVC pattern, which would be fair enough if I didn't have to use a specific layout for drawing the model graph in the editor view. The domain model I am using includes no visual information whatsoever (which in itself is a good idea), but I'd like to be able to assign coordinates to the Figures in their EditParts. This would make it much easier for me to calculate the position of each figure in the layout.
Now I have stumbled upon the Presentation Model pattern by Martin Fowler, which seems to be just about the thing I was looking for. I have also found an old-ish tutorial on RCP UI testing (German only), which uses this pattern in an Eclipse RCP context.
Now I'm wondering: is it generally possible to use PM in a GEF context, seeing that GEF explicitly uses MVC? Is MVVM an alternative?
Please note that I am prevented from using GMF for a number of reasons.
Many thanks!
Yes, it's definitely possible, and you have two choices.
The first is to implement your own graphical notation model. I would suggest an approach like:
modelElement : ModelElement 1..1
x : int 1..1
y : int 1..1
Then load both models into an EditingDomain (EMF will resolve cross-document references for you), create any missing graphical notation elements, etc.
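Stripped of EMF boilerplate, the idea is a notation object that wraps a domain element and carries only the visual information. A plain-Java sketch (class and field names are assumptions; in a real setup these would be generated EObjects with a cross-document reference to the domain element):

```java
// Hypothetical domain element from the third-party EMF model
class ModelElement {
    final String name;
    ModelElement(String name) { this.name = name; }
}

// Notation model: one node per domain element, holding only visual data.
// The domain model stays free of coordinates; the layout code works
// against the notation node instead of the EditPart's Figure directly.
class NotationNode {
    final ModelElement modelElement; // 1..1 reference into the domain model
    int x;                           // 1..1 layout coordinate
    int y;                           // 1..1 layout coordinate

    NotationNode(ModelElement modelElement, int x, int y) {
        this.modelElement = modelElement;
        this.x = x;
        this.y = y;
    }
}

public class NotationSketch {
    public static void main(String[] args) {
        ModelElement task = new ModelElement("Task");
        NotationNode node = new NotationNode(task, 120, 40);
        System.out.println(node.modelElement.name
                + " at (" + node.x + "," + node.y + ")");
    }
}
```

The EditPart then resolves its Figure's bounds from the notation node, so the third-party domain model never needs to know about coordinates.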
The other option is to use GMF or Graphiti. They have the model you're looking for out of the box, which will greatly simplify your life, at the cost of learning yet another monster framework (in the case of GMF). Graphiti is easy (relative to GEF/GMF), but IMO it's less flexible. GMF, by the way, will give you a 'free' TransactionalEditingDomain, which will handle all the commands, undos, and redos for you. So, as in the comments to your previous question, I would suggest using GMF.
Oh, sorry, I didn't notice what you wrote about GMF.
In that case, the second option is to have the graphical model inherit from the domain model, and then code your GEF editor against this model.

One Model to Rule Them All - VS2010 UML, ADO.NET Entity Data Model, and T4

I worked on a fairly large project a while back where we modeled the classes in Enterprise Architect and generated the (partial) POCO classes (complete with model-driven business rule validations), persistence (NHibernate mapping file) and DDL. Based on certain model attributes we could flag alternate generation strategies or indicate that a particular portion would be entirely hand-coded.
There was a good deal of initial investment, but it paid large dividends over the lifetime of a 15 developer, 3 year project.
I'm investigating doing something similar with the current Microsoft technology stack. The place I'm stuck is that class modeling is done with the VS 2010 UML tools, but logical data modeling is done with Entity Data Modeler.
Is it a reasonable path to use VS 2010 UML as the "single source of truth" and code generate the edmx files based on the class model? That's the inverse of the common path to create the entity model and use a POCO generator to generate classes. However, a good class model can be used to generate much more than just the properties so I tend to view it as a better choice than the entity model.
Entity Data Modeler is limited to a single diagram per model and becomes unusable in non-trivial scenarios. You can use UML profiles to extend class models for logical data modeling. It requires a significant investment of effort and time which may be justified on a 3-year 15-developer project.
It's always going to be a problem, as each modeling layer maps two disparate worlds. To generate fully aware code, your generation system must have access to all of the mapping models. In other words, you can't simply declare one to be the "master", as each layer is a "real" perspective of the solution.
Yes, this is possible. No, there is nothing built in. To do this you'd need to write a VSIX which would consume the model and emit EDMX/code. This isn't necessarily hard, but you'd have to do it yourself. You'd also need a pattern or attributes for handling the modeling aspects which you might not have in your diagrams, just like you have to do for specifying key fields and the like when doing code-first modeling.

Persistence framework?

I'm trying to decide on the best strategy for accessing the database. I understand that this is a generic question and there's no single good answer, but I will provide some guidelines on what I'm looking for.
For the last few years we have been using our own persistence framework, which, although limited, has served us well. However, it needs some major improvements, and I'm wondering whether I should go that way or use one of the existing frameworks. The criteria I'm looking for, in order of importance, are:
Client code should work with clean objects, with no database knowledge. When using our custom framework, the client code looks like:
SessionManager session = new SessionManager();
Order order = session.CreateEntity<Order>();
order.Date = DateTime.Now;
// Set other properties
OrderDetail detail = order.AddOrderDetail();
detail.Product = product;
// Other properties
// Commit all changes now
session.Commit();
It should be as simple as possible and not "too flexible"; we need a single way to do most things.
It should have good support for object-oriented programming: one-to-many and many-to-many relations, inheritance, and lazy loading.
Configuration is preferred to be XML based.
With my current knowledge I see these options:
Improve our current framework - Problem is that it needs a good deal of effort.
ADO.NET Entity Framework - I don't have a good understanding of it, but it seems too complicated and has bad reviews.
LINQ to SQL - Does not handle object-oriented practices well.
NHibernate - Seems a good option, but some users report too many esoteric errors.
SubSonic - From a short introduction, it seems too flexible. I do not want that.
What would you suggest?
EDIT:
Thank you Craig for the elaborate answer. I think it will help more if I give more details about our custom framework. I'm looking for something similar. This is how our custom framework works:
1. It is based on DataSets, so the first thing you do is configure the DataSets and write the queries you need there.
2. You create an XML configuration file that specifies how DataSet tables map to objects, and also specifies the associations between them (all types of associations are supported).
3. A custom tool parses the XML configuration and generates the necessary code.
4. The generated classes inherit from a common base class.
To be compatible with our framework, the database must meet these criteria:
Each table must have a single column as its primary key.
All tables must have a primary key of the same data type, generated on the client.
For inheritance, only single-table inheritance is supported. Also, the XML file almost always offers a single way to achieve something.
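For illustration, a hypothetical fragment of such a mapping file might look like this (the element and attribute names are invented; the actual framework's schema is not shown in the question):

```xml
<!-- Hypothetical mapping file: names are illustrative only -->
<mapping>
  <entity class="Order" table="Orders">
    <key column="OrderId" />
    <property name="Date" column="OrderDate" />
    <association name="Details" type="one-to-many"
                 target="OrderDetail" foreignKey="OrderId" />
  </entity>
  <entity class="OrderDetail" table="OrderDetails">
    <key column="OrderDetailId" />
    <property name="Product" column="ProductId" />
  </entity>
</mapping>
```

A code generator walks this file to emit the entity classes and their association properties, which is essentially the same role NHibernate's hbm.xml mapping files play.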
What we want to support now is:
Remove the dependency on DataSets. SQL code should be generated automatically, but the framework should NOT generate the schema; I want to control the DB schema manually.
More robust support for inheritance hierarchies.
Optional integration with LINQ.
I hope it is clearer now what I'm looking for.
Improve our current framework - Problem is that it needs a good deal of effort
In your question, you have not given a reason why you should rewrite functionality that is available from so many other places. I would suggest that reinventing an ORM is not a good use of your time, unless you have unique needs for the ORM which you have not specified in your question.
ADO.NET Entity Framework
We are using the Entity Framework in the real world, production software. Complicated? No more so than most other ORMs as far as I can tell, which is to say, "fairly complicated." However, it is relatively new, and as such there is less community experience and documentation than something like NHibernate. So the lack of documentation may well make it seem more complicated.
The Entity Framework and NHibernate take distinctly different approaches to the problem of bridging the object-relational divide. I've written about that in a good bit more detail in this blog post. You should consider which approach makes the most sense to you.
There has been a great deal of commentary about the Entity Framework, both positive and negative. Some of it is well-founded, and some of it seems to come from people who are pushing other solutions. The well-founded criticisms include:
Lack of POCO support. This is not an issue for some applications, it is an issue for others. POCO support will likely be added in a future release, but today, the best the Entity Framework can offer is IPOCO.
A monolithic mapping file. This hasn't been a big issue for us, since our metadata is not in constant flux.
However, some of the criticisms seem to me to miss the forest for the trees. That is, they talk about features other than the essential functionality of object relational mapping, which the Entity Framework has proven to us to do very well.
LINQ to SQL - Does not have good handling of object-oriented practices
I agree. I also don't like the SQL Server focus.
NHibernate - Seems a good option, but some users report too many esoteric errors.
Well, the nice thing about NHibernate is that there is a very vibrant community around it, and when you do encounter those esoteric errors (and believe me, the Entity Framework also has its share of esoteric errors; it seems to come with the territory) you can often find solutions very easily. That said, I don't have a lot of personal experience with NHibernate beyond the evaluation we did which led to us choosing the Entity Framework, so I'm going to let other people with more direct experience comment on this.
SubSonic - From a short introduction, it seems too flexible. I do not want that.
SubSonic is, of course, much more than just an ORM, and SubSonic users have the option of choosing a different ORM implementation instead of using SubSonic's ActiveRecord. As a web application framework, I would consider it. However, its ORM feature is not its raison d'être, and I think it's reasonable to suspect that the ORM portion of SubSonic will get less attention than the dedicated ORM frameworks do.
LLBLGen is a very good ORM tool that will do almost all of what you need.
iBATIS is my favourite, because you get finer-grained control over the SQL.
Developer Express Persistence Objects, or XPO as it is better known. I have used it for 3 years. It provides everything you need, except that it is commercial and you tie yourself to another (single) company for your development. Other than that, Developer Express is one of the best component and framework providers for the .NET platform.
An example of XPO code would be:
using (UnitOfWork uow = new UnitOfWork())
{
    Order order = new Order(uow);
    order.Date = DateTime.Now;
    uow.CommitChanges();
}
I suggest taking a look at ActiveRecord from Castle.
I don't have production experience with it; I've just played around with their sample app. It seems really easy to work with, but I don't know it well enough to know whether it fits all your requirements.
