I am looking for guidance on how to model and create FHIR-like (custom) resources that are not defined in the FHIR specification, i.e. not published resources. The reasoning is to keep all FHIR and non-FHIR data in a single persistence store rather than in two separate data stores. I know that I can create these resources by inheriting from DomainResource, but that requires compilation and deployment for each new data model. The question is: is there a way to do this at runtime, e.g. by POSTing a StructureDefinition?
HAPI doesn't support custom resources "on the fly" out of the box. You would need to create model classes, DAOs, and resource providers at runtime and register them yourself.
Mule has built-in object stores to cache data, but what is the purpose of using custom object stores under caching strategies? If possible, please mention a use case.
Custom object stores are useful when you want to use a custom persistence mechanism for your ObjectStores.
By default Mule provides two implementations: an in-memory store and a file-based persistent store.
One possible use case: if you use Enterprise Edition and have clustering enabled, you also have the ability to share these stores across multiple Mule nodes. However, if you do not use EE or clustering but still want to share data between multiple Mule instances, then you will need a persistent object store that can be shared across multiple Mule nodes.
The ObjectStore interface has many implementations, so you can choose a persistence mechanism that best suits you. Examples include Redis, Ehcache, Mongo, Cassandra, and JDBC. More on this here: http://java.dzone.com/articles/synchronizing-data-across-mule
Ryan has given the correct explanation. I just want to add:
Mule uses object stores whenever it needs data to persist for later retrieval.
A custom object store can be configured using Java classes, and you can customize the object store as per your needs.
From the Java class you can control the cache and the cache keys, store and retrieve the data, log your cache keys and cache contents, list your cache keys, and so on, which means full control over the custom object store.
Please go through the following links:
http://ricston.com/blog/cache-scope-ehcache/
http://java.dzone.com/articles/cache-scope-ehcache
http://www.mulesoft.org/documentation/display/current/Mule+Object+Stores
I'm currently working with Laravel 5 on a CRM application, using the repository pattern under the Providers directory. But I'm completely at a loss to understand the Services directory and its purpose.
Can anyone give me an example of how to use these directories and explain the difference between them?
Services
Services are re-usable classes that do not belong in a controller. For example, a service which is required by more than one controller, such as a class for building site navigation. It is a good place to put "global" classes (global to your app), which can be injected into a controller for use across your application.
Providers
Providers inject services into the dependency injection system, making them easier to access across the application. Packages which are Laravel-specific usually include a service provider, which ensures that the package's classes are loaded when required and available to your controllers.
Services (http://laravel.com/docs/5.0/structure)
The Services directory contains various "helper" services your application needs to function. For example, the Registrar service included with Laravel is responsible for validating and creating new users of your application. Other examples might be services to interact with external APIs, metrics systems, or even services that aggregate data from your own application.
Providers
The purpose of the Providers directory is basically to bind your custom classes into the app. For example, if we want to work with the repository pattern and use Eloquent instead of writing queries in the models, then we need to bind our repositories to the app via service providers and register those service providers in the config/app.php file.
I have a project in which I am using NHibernate and ASP.Net MVC. The application is intended to allow users to track certain data and then produce views of statistics based upon the data entered. The structure of my application thus far looks something like this:
NHibernate Layer: Contains Repository<T> and UnitOfWork classes, as well as entity mapping definitions.
Core/Service Layer: Contains generic EntityService class. At the moment, this simply defines transaction scope via IUnitOfWork and interfaces with IRepository to provide higher-level data access services.
Presentation Layer (MVC Application): Not yet implemented, but contains the usual stuff plus dependency injection.
I have a couple of questions:
Is it poor design to allow my MVC application to handle dependency injection for ALL layers? For example, as well as dependency injection of EntityService instances into controllers, it will handle the dependency injection of IRepository into the EntityService classes. Should the service layer handle this itself, even though this would mean performing dependency injection in two distinct places?
Where should I produce my statistics? This business logic doesn't seem to belong in my service layer, which, at present, only contains entity type definitions and an interface for modifying and accessing entity properties. I have a few thoughts on this, but I'm not sure which I like best:
Keep my service layer as is and create a separate Statistics project - this is completely independent of the entity types for which it will be used, meaning my MVC controllers will have to pass raw numerical information between my business entities and my (presumably static) statistics classes. This is quite a neat separation but potentially means a lot of business logic still remaining in the presentation layer.
Create a Statistics project; however, create a tight coupling between the classes in this project and my business entities. For example, instead of passing a Reading object's values into a method, I will pass the entire object (or define them as extension methods). This will shift business logic out of my MVC app but the tight coupling seems a bit messy.
Keep all of my business logic inside my service layer. Define strongly-typed subclasses of EntityService, so my services contain both entity-specific business methods and data storage methods, while keeping the entity classes themselves as pure data containers. Create a separate Statistics project for any generic statistical processing and call its methods via my derived service classes. My service classes effectively merge business functions with the storage functionality provided by IRepository<T>.
I am leaning toward the third option, but does anyone have any thoughts? Alternative suggestions?
Thanks in advance!
Preliminary observation:
I like the way you described your project; I just didn't understand why your Data Access Layer (DAL) is called the NHibernate Layer. It is at odds with the rest of the description, in which you (correctly) avoided using technology names to describe logical layers. So I suggest you rename it DAL and use it to abstract your app from NHibernate.
My opinions about your questions:
Absolutely not. It is good to apply dependency injection to all layers. A couple of reasons why it is good:
1.1 Testing: you can mock the DAL interfaces and unit test the service layer without the DAL by using another DI config file (see the sketch after these two points). In the same way you can mock the services for the web controllers layer, and so on.
1.2 Different DAL implementations: suppose you need different DAL technologies (NoSQL, plain SQL, or LINQ instead of NHibernate, etc.) for different deployments of your project, or to scale in the future. You can do that easily by maintaining different DI config files.
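To make point 1.1 concrete, here is a minimal sketch of a unit test that mocks the DAL and exercises the service layer in isolation. It assumes Moq and NUnit, and it invents an IRepository<T>.GetAll() member, an EntityService<T> constructor accepting the repository, and a FindAll() method, since the exact shapes of your interfaces weren't shown:

    using System.Collections.Generic;
    using System.Linq;
    using Moq;
    using NUnit.Framework;

    [TestFixture]
    public class EntityServiceTests
    {
        [Test]
        public void FindAll_ReturnsEntitiesFromRepository()
        {
            // Mock the DAL interface instead of touching NHibernate or a database.
            var repository = new Mock<IRepository<Reading>>();
            repository.Setup(r => r.GetAll())
                      .Returns(new List<Reading> { new Reading(), new Reading() });

            // Inject the mock exactly as the DI container would at runtime.
            var service = new EntityService<Reading>(repository.Object);

            var results = service.FindAll().ToList();

            Assert.AreEqual(2, results.Count);
            repository.Verify(r => r.GetAll(), Times.Once());
        }
    }

Swapping the mock for a real NHibernate-backed repository is then purely a matter of DI configuration.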
You can have the same layer deployed in different projects, and in the same way a single project can contain different layers. I think their relationship is orthogonal: a project describes a physical (development-time and run-time) artifact, while layers are logical. So initially I would keep it simple and go with the third option.
I just don't understand why you say the following regarding this option:
Create a separate Statistics project for any generic statistical processing and call its methods via my derived service classes. My service classes effectively merge business functions with the storage functionality provided by IRepository<T>.
I see Statistics as one or more services, so you can implement it as a namespace with classes inside your service layer. As with any other service, you can inject DAL repository classes into it. And, as with any other service or DAL class, the model classes can be shared between the different services and DAL classes.
StatsService.AverageReadingFor(Person p, DateTime start, DateTime end) sounds good.
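A minimal sketch of what that could look like, assuming the repository exposes an IQueryable-returning Query() method and that Reading/Person carry members such as Id, PersonId, TakenAt, and Value (all of these names are assumptions, since your entities weren't shown):

    using System;
    using System.Linq;

    // A statistics service living in the service layer; the repository is
    // injected so the implementation stays testable and storage-agnostic.
    public class StatsService
    {
        private readonly IRepository<Reading> _readings;

        public StatsService(IRepository<Reading> readings)
        {
            _readings = readings;
        }

        public double AverageReadingFor(Person p, DateTime start, DateTime end)
        {
            // Query() is a hypothetical IQueryable-returning member of IRepository<T>;
            // substitute whatever query entry point your repository actually exposes.
            return _readings.Query()
                            .Where(r => r.PersonId == p.Id && r.TakenAt >= start && r.TakenAt <= end)
                            .Average(r => r.Value);
        }
    }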
There are several implementation options:
Using underlying repository features (for example: SQL avg function)
Using the Observer pattern, which can also be implemented via dependency injection
Using Aspect-Oriented Programming; see the Spring.NET AOP chapter as an example.
If you have more than one service-layer instance (more than one server), then options 2 and 3 must be adapted for out-of-process communication using a messaging system.
Just an update - Regarding my second question, I have decided to define an IStatsService<T> which expects an IEntityService<T> to be passed into its constructor. I'll use this for generic statistical processing of business entities and create further interfaces that implement IStatsService<T> where I need more type-specific information.
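For anyone curious what that looks like in code, here is a rough sketch of the shape I ended up with; the GetAll() member on IEntityService<T> is just a stand-in for whatever read method the service layer actually exposes:

    using System;
    using System.Linq;

    // Generic statistics contract; an IEntityService<T> is injected through the
    // constructor so the stats code never talks to the repository directly.
    public interface IStatsService<T>
    {
        double Average(Func<T, double> selector);
    }

    public class StatsService<T> : IStatsService<T>
    {
        private readonly IEntityService<T> _entityService;

        public StatsService(IEntityService<T> entityService)
        {
            _entityService = entityService;
        }

        public double Average(Func<T, double> selector)
        {
            // GetAll() is a hypothetical member of IEntityService<T>.
            return _entityService.GetAll().Average(selector);
        }
    }

    // Type-specific statistics can then extend the generic contract, e.g.
    // IReadingStatsService : IStatsService<Reading> with AverageReadingFor(...).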
Hopefully this will help someone who has been scratching their head about a similar problem!
I will start to code a new Web application soon. The application will be built using ASP.Net MVC 3 and Entity Framework 4.1 (Database First approach). Instead of using the default EntityObject classes, I will create POCO classes using the ADO.NET POCO Entity Generator.
When I create POCOs using this tool, it automatically adds the virtual keyword to all properties (to enable change-tracking proxies) and to navigation properties (to enable lazy loading).
I have however read and seen from demonstrations, that Julie Lerman (EF Guru!) seems to turn off lazy loading and also modifies her POCO template so that the Virtual keyword is removed from her POCO classes. Julie states the reason why she does this is because she is writing applications for WCF services and using the Virtual keyword with this causes a Serialization issue. She says, as an object is getting serialized, the serializer is touching the navigation properties which then triggers lazy loading, and before you know it you are pulling the whole database across the wire.
I think Julie was perhaps exaggerating when she said this could pull the whole database across the wire; even so, this thought scares me!
My question is (finally): should I also remove the virtual keyword from my POCO classes for my MVC application, use DetectChanges for my change tracking, and use eager loading to request navigation properties?
Your help with this would be greatly appreciated.
Thanks as ever.
Serialization can indeed trigger lazy loading because the getter of the navigation property doesn't have a way to detect if the caller is the serializer or user code.
This is not the only issue: whether you mark only the navigation properties as virtual or all properties as virtual, EF will create a proxy type at runtime for your entities. Therefore, the entity instances the serializer has to deal with at runtime will typically be of a type different from the one you defined.
Julie's recommendations are the simplest and most reasonable way to deal with the issues, but if you still want to work with the capabilities of proxies most of the time and only sometimes serialize them with WCF, there are other workarounds available:
You can use a DataContractResolver to map the proxy types to be serialized as the original types
You can also turn off lazy loading only when you are about to serialize a graph (a sketch of this follows below)
More details are contained in this blog post: http://blogs.msdn.com/b/adonet/archive/2010/01/05/poco-proxies-part-2-serializing-poco-proxies.aspx
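As a quick illustration of the second workaround (switching lazy loading off only around serialization), here is a sketch using the DbContext API; BlogContext, Post, and Comment are hypothetical types used purely for illustration:

    using System.Collections.Generic;
    using System.Data.Entity;
    using System.Linq;

    // Hypothetical entities, for illustration only.
    public class Post
    {
        public int Id { get; set; }
        public virtual ICollection<Comment> Comments { get; set; }
    }

    public class Comment
    {
        public int Id { get; set; }
    }

    public class BlogContext : DbContext
    {
        public DbSet<Post> Posts { get; set; }
    }

    public static class SerializationHelper
    {
        public static Post GetPostForSerialization(int id)
        {
            using (var context = new BlogContext())
            {
                // Disable proxy creation and lazy loading for this context instance
                // so the serializer receives plain POCOs and cannot trigger extra
                // database loads while walking navigation properties.
                context.Configuration.ProxyCreationEnabled = false;
                context.Configuration.LazyLoadingEnabled = false;

                // Eager-load exactly the part of the graph the caller needs.
                return context.Posts
                              .Include("Comments")
                              .Single(p => p.Id == id);
            }
        }
    }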
Besides this, my recommendation would be that you use the DbContext template and not the POCO template. DbContext is the new API we released as part of EF 4.1 with the goal of providing greater productivity. It has several advantages, such as the fact that it automatically performs DetectChanges, so in general you won't need to worry about calling the method yourself. Also, the POCO entities we generate for DbContext are simpler than the ones generated by the POCO templates. You should be able to find lots of MVC examples using DbContext.
Well, it depends on your needs. If you are going to serialize your POCO classes, then yes, you should remove the virtual keyword (for example, when using WCF services or basically anything that will serialize your entire object). But if you are just building a web app that accesses these classes directly, then I would leave the virtual keyword in, since you control which objects your code touches.
I have the following scenario: a web part needs certain configuration parameters (primitive data types), e.g. a URL (string), to retrieve and show data from an external system. As each instance of the web part within a web application should retrieve the data from the same system, the parameters are stored in the SPPropertyBag of the web application so the web part knows where to look for them. The parameters are written to the property bag via an application page in Central Administration (CA).
At the moment the web part uses a configuration object which implements the singleton pattern to access the configuration parameters stored in the property bag. The disadvantage is that the web part won't recognize a change of the configuration parameters until the application pool is reset and the singleton object is newly created with the updated parameters.
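For reference, the current mechanism looks roughly like the sketch below. Reading from SPWebApplication.Properties and the key and URL used here are assumptions for illustration; the point is simply that the values are read once in the private constructor:

    using System;
    using Microsoft.SharePoint.Administration;

    public sealed class WebPartConfig
    {
        private static readonly WebPartConfig current = new WebPartConfig();

        public static WebPartConfig Current
        {
            get { return current; }
        }

        public string ExternalSystemUrl { get; private set; }

        private WebPartConfig()
        {
            // Values are read exactly once, which is why changes made via the CA
            // application page are not picked up until the application pool is
            // recycled and this singleton is recreated.
            var webApp = SPWebApplication.Lookup(new Uri("http://intranet/")); // hypothetical URL
            ExternalSystemUrl = webApp.Properties["ExternalSystemUrl"] as string;
        }
    }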
Now I'm looking for a way to optimize this mechanism so that the singleton object can recognize configuration changes and reread the parameters without recycling the application pool.
I thought about some kind of caching mechanism which somehow informs the singleton object that the parameters have changed. I've read some articles about cache dependencies which might be a way to go but I'm not sure how to use them with SPPropertyBag objects.
So I'm wondering how you would handle this?