NHibernate performance on an Internet banking application

Currently we have a project to implement an Internet banking site, and we are evaluating using NHibernate for it. Is NHibernate suitable for this kind of application, where performance is important and a large number of users will be performing operations simultaneously?
Do you know of any success stories of using NHibernate in this kind of environment?
I think NHibernate is slow only when it is used incorrectly, and that we can use it well with careful tweaking, best practices, and common sense.
UPDATE: We were contacted about the project not long ago, and we are still collecting requirements to define the specs. The application is for a small-to-medium bank in our country, so they expect at most around 200-300 simultaneous users.
I'm pretty sure the database will be SQL Server 2005, and it will be an n-tier application using web services to access the data layer.

My team has been using NHibernate in a system requiring high throughput for years without a problem. NH is fairly efficient to begin with, and provides fine-grained control over when and how objects are reconstituted.
With that said, we don't know the specifics of your problem, so we can't make certain predictions. Perform scaling tests before you commit yourself.
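To illustrate the kind of control that matters under load, here is a minimal sketch of a read path that leans on NHibernate's query cache and bounded result sets. The entity, its properties, and the repository are hypothetical, and it assumes the second-level/query cache is enabled in the session-factory configuration:

```csharp
using System.Collections.Generic;
using NHibernate;
using NHibernate.Criterion;

// Hypothetical mapped entity; mappings (lazy loading etc.) are configured elsewhere.
public class Account
{
    public virtual int Id { get; set; }
    public virtual int CustomerId { get; set; }
    public virtual decimal Balance { get; set; }
}

public class AccountRepository
{
    private readonly ISessionFactory _sessionFactory;

    public AccountRepository(ISessionFactory sessionFactory)
    {
        _sessionFactory = sessionFactory;
    }

    public IList<Account> GetAccountsForCustomer(int customerId)
    {
        using (ISession session = _sessionFactory.OpenSession())
        {
            return session.CreateCriteria<Account>()
                .Add(Restrictions.Eq("CustomerId", customerId))
                .SetCacheable(true)  // repeat lookups are served from the query cache
                .SetMaxResults(100)  // bound the result set; never page in a whole table
                .List<Account>();
        }
    }
}
```

The point is not the specific calls but that NHibernate lets you decide, per query, what is cached, how much is fetched, and when objects are hydrated.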

NHibernate can be suitable if used correctly.
But don't pin yourself down on this answer, since we do not know the exact specs.

Related

Couchbase and NoSQL DBs for a serious web application?

I have read some posts here, but I didn't find a real answer.
Normally I work, and have worked, with conventional SQL databases (MS SQL, MySQL) when developing applications (ERP, CRM, PPS, web shops, etc.). Real contact with, or experience of, document-oriented databases in a real business setting has not been possible.
Only in private (hobby and experimental) projects have I tested MongoDB and CouchDB. The experience was good, but not enough to say "Yes, let's use it for business!", because I could not test them in a real environment.
But now there is a chance to build something from scratch, which could be a big start for a business.
So my questions:
Can I use Couchbase for a big business application that thousands of users would use? Is it fast enough, and does it perform well enough, to handle thousands of queries, requests/responses, etc.?
What do backup and restore look like?
Where are Couchbase's limits?
Thank you for the answers.
In short, yes.
Your questions are too broad to fully address here. Couchbase has many real-world installations with clients doing production work at large scale. You can see several references, with write-ups of their uses, on the Couchbase site. (Note this is not a complete list of customers, only the ones that have agreed to have their use highlighted.) You will definitely recognize some names.

Designing XPages applications for large user populations and high performance

The following questions were posed by a customer who is about to write a large-scale XPages application. While I think the questions are actually too broad for Stack Overflow's style, they are interesting, and the collective knowledge of the experts here could yield better results than one person answering them:
1. How many concurrent users can use XPages applications on one Lotus Domino server (there are several applications on the server, not just one)?
2. How can we detect and analyze memory leaks on a Lotus Domino server when running XPages applications?
3. How can we write XPages the right way to achieve the best performance and avoid memory leaks?
4. What code, methods, and objects should not be used?
5. What are typical errors when a LotusScript developer begins to write code for XPages? What are the best practices?
6. How can we build a centralized, consolidated XPages application for 10,000-15,000 users? How many servers do we need? How should the XPages application be configured in that case?
7. How do we load-balance users?
I will provide my insights; please share yours.
How long is a string? It depends on how the server is configured, and "application" could mean a single form or hundreds. Only a test can tell. In general: build a high-performance server, preferably with a 64-bit architecture and lots of RAM, and make that RAM available to the JVM. If the applications use attachments, use DAOS and put it on a separate disk - and of course make sure you have the latest version of Domino (8.5.3 FP1 at the time of this writing).
There is the XPages Toolbox, which includes a memory and CPU profiler.
It depends on the type of application. Clever use of the scopes for caching, and Expression Language and beans instead of SSJS. You leak memory when you forget .recycle(). Hire an experienced lead developer and read the book, also the other one and two. Consider threading off longer-running code, so users don't need to wait.
It depends on your needs. The general lessons of Domino development apply when it comes to DB operations: FTSearch over DBSearch, scope usage over @DbColumn for parameters, and EL over SSJS.
Typical errors include: all code in the XPages (use script libraries instead); too many @DbLookup/@DbColumn calls instead of scopes; validation in buttons instead of validators; violation of decomposition principles; forgetting to use .recycle(); designing applications "like old Notes screens" instead of single-page interaction; too little use of partial refresh; no use of caching; too little object orientation (creating function graveyards in script libraries).
This is a summary of questions 1-5; nothing new to answer.
When clustering Domino servers for XPages and putting a load balancer in front, the load balancer needs to be configured to keep a session on the same server, so partial refreshes and Ajax calls reach the server that has the component tree rendered for that user.
It depends on the server setup. For example, I have an XPages extranet with 12,000 registered users spanning approximately 20 XPages applications. That runs on one Windows 2003 server with 4 GB of RAM and a quad-core CPU. The data amounts to about 60 GB across these 20 applications. No DAOS, no beans, just SSJS, and performance is excellent. When I upgrade this installation to 64-bit and DAOS, the applications will scale even further. So 64-bit and lots of RAM are the key to supporting a lot of users.
I haven't done anything in this area.
Make sure to recycle() when you loop over documents. Use the openntf.org debug toolbar; it will save a lot of time until we have a real debugger for XPages.
Always remember that whatever you are doing will be done by several users at once, so try to cut down the number of lookups and getDocumentByKey calls. Try to use ViewNavigator when you can.
It all depends on how many users use the system concurrently. If you have 10,000-15,000 concurrent users, then you have to look at what the applications do and how many users will use the same application at the same time.
Those are my insights into the questions.

MVC and its limitations

MVC sets up a clear distinction between Model, View, and Controller.
For the model, web frameworks nowadays provide the ability to map the model directly to database entities (ORM), which, IMHO, ends up causing performance issues at runtime due to direct database I/O.
The thing is, if that's really the case, why is ORM so popular, and why does every web framework want to support it, natively or otherwise?
For a web site with a huge amount of traffic, it definitely won't work. But what's the workaround? Connecting directly to the database is definitely not a wise solution here.
What's your question?
Is it a good idea to use direct DB access from web pages?
A: No.
Is it a good idea to use ORMs?
A: Debatable: see How can I design a Java web application without an ORM and without embedded SQL
Is it a good idea to use the MVC model?
A: Yes - it has nothing to do with "direct" database access; it's about separating your application logic from your model and your display (put simply).
And the rationale for not putting database logic inside web pages has nothing to do with performance - it's about security, maintainability, etc. Calling a stored procedure from a web page is likely to be MORE performant than using an ORM, but it's still bad, because the performance gain is negligible and the cons are significant.
As to the workaround: if you mean how you hook up a database to a web application...?
The simplest way is to use something like Entity Framework or Linq-to-Sql with your Model - there are plenty of examples of this in tutorials on the web.
A better method, IMO, is to have a separate services layer (which may be WCF-based) and keep all the database access inside it, with DTOs transferring the data to your web application, which has its own ViewModel.
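A minimal sketch of that shape - all names here are hypothetical - where the web application only ever sees flat DTOs and the ORM stays behind the service boundary:

```csharp
// ORM-mapped entity (simplified); never crosses the service boundary.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

// Flat data-transfer object exposed to the web application; no navigation
// properties, no lazy-loading proxies, nothing tied to the ORM.
public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

// The contract the web application (or WCF client) talks to.
public interface ICustomerService
{
    CustomerDto GetCustomer(int id);
}

public class CustomerService : ICustomerService
{
    public CustomerDto GetCustomer(int id)
    {
        Customer entity = LoadCustomerEntity(id);
        // Map entity -> DTO before the data leaves the services layer.
        return new CustomerDto { Id = entity.Id, Name = entity.Name, Email = entity.Email };
    }

    // Stand-in for the real data access (NHibernate, Entity Framework, ...).
    private Customer LoadCustomerEntity(int id)
    {
        return new Customer { Id = id, Name = "Jane Doe", Email = "jane@example.com" };
    }
}
```

The web application then builds its own ViewModel from the DTO, so the exposed model and the database model can evolve independently.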
MVC is not about ORM but about separating display logic from business logic. There is no reason your exposed model needs to be identical to your database model, and there are many reasons to ensure that the exposed model closely matches what is to be displayed.
The other part of the solution, to scale well, would be to implement caching in the controller and to be able to distribute load across several instances.
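For the caching half of that, ASP.NET MVC's built-in OutputCache attribute is the cheapest first step; a sketch, with the controller and service names made up:

```csharp
using System.Web.Mvc;

// Hypothetical service-layer contract; the view model's shape is omitted here.
public interface IProductService
{
    object GetProductViewModel(int id);
}

public class ProductsController : Controller
{
    private readonly IProductService _productService;

    public ProductsController(IProductService productService)
    {
        _productService = productService;
    }

    // Cache the rendered result per product id for 60 seconds, so repeated
    // requests are served without touching the service layer or the database.
    [OutputCache(Duration = 60, VaryByParam = "id")]
    public ActionResult Details(int id)
    {
        var viewModel = _productService.GetProductViewModel(id);
        return View(viewModel);
    }
}
```

To distribute load across several instances, the cache eventually has to move out of process (a distributed cache), but the controller-level pattern stays the same.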
I think @BonyT has given a good answer (and I've voted for it :) ). I'd just add that:
"web frameworks provide the ability to map the model directly to database entities (ORM), which, IMHO, ends up causing performance issues at runtime due to direct database I/O"
Even if this is true, using an ORM solves a lot of problems by keeping the model easy to update and to translate back and forth to the database. Solving a performance hit by buying extra web servers or cloud instances is much cheaper than having to buy extra developers, or extra hours in development, to solve problems that other people have already written ORMs to solve for you.

Strategies for autocompletion web services in .NET - non-UI focused

I'm getting a little tired of all the UI demos of autocompletion in ASP.NET. I believe the UI portion of autocompletion has been solved many times over.
My question is: how do you best handle the queries hitting your web services? I'm currently implementing an autocompletion service for a musician database. The database is fairly small, with only 20,000 rows, but autocompletion is extremely speed-sensitive: it needs to feel instant to be of any use.
I'm currently using NHibernate for my DAL, but I'm wondering if this is a place where I may want to bypass NHibernate. Perhaps projections on named queries would be the best strategy? Where do I cache - NHibernate's second-level cache, or the web service?
I've already thought of a lot of naive ways to develop this, but I would like to soak up any tips people already have in the wild. Also, what if you have many different types of entities you want autocompletion on? Do you spread those implementations around their different repositories, or do you design and implement a completely separate autocompletion service?
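For concreteness, the projection idea would look something like this (the entity and property names are made up):

```csharp
using System.Collections.Generic;
using NHibernate;
using NHibernate.Criterion;

// Hypothetical mapped entity for the musician database.
public class Musician
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

public static class AutocompleteQueries
{
    // Return only the matching names as strings: no entity hydration,
    // no change tracking, and minimal data over the wire.
    public static IList<string> SuggestNames(ISession session, string prefix)
    {
        return session.CreateCriteria<Musician>()
            .Add(Restrictions.Like("Name", prefix + "%")) // prefix match can use an index
            .SetProjection(Projections.Property("Name"))
            .AddOrder(Order.Asc("Name"))
            .SetMaxResults(10) // autocompletion only ever needs a handful of rows
            .List<string>();
    }
}
```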
This depends on how large your site's traffic is. I generally suggest using a product such as MemCached or MemCached Win32, depending on what is available in your environment (MemCached on cheap Linux boxes is best if you can - all that is needed is a ton of memory!). You might also look at something like Velocity (MS's new cache cloud offering). This would then allow you to cache a key (whatever the query is) with the results, efficiently.
Keep your cache times down based on how frequently you update your dataset. If you don't update often, the cache time can be longer. If you find that your cache cloud is growing like crazy, you might want to cache only what is most frequently asked for (though your cache implementation should handle this by evicting what is not accessed frequently!).
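A sketch of that cache-aside pattern - using the in-process HttpRuntime.Cache here for brevity, though the same shape applies to a MemCached or Velocity client; the key prefix and timings are arbitrary:

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class AutocompleteCache
{
    // Look up suggestions for a prefix, hitting the backing store only on a miss.
    public static IList<string> GetSuggestions(string prefix, Func<string, IList<string>> loadFromDb)
    {
        string key = "ac:" + prefix.ToLowerInvariant(); // the query itself is the cache key

        var cached = HttpRuntime.Cache[key] as IList<string>;
        if (cached != null)
            return cached;

        IList<string> results = loadFromDb(prefix);

        // Short absolute expiry: if the dataset changes rarely, a few minutes
        // of staleness is acceptable and absorbs most of the traffic.
        HttpRuntime.Cache.Insert(key, results, null,
            DateTime.UtcNow.AddMinutes(5), Cache.NoSlidingExpiration);

        return results;
    }
}
```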

About Dynamics CRM performance

My boss asked me to research the available CRMs on the market, because the one we are using currently is rather a mess.
For me, as a .NET developer, it would be great to choose and implement Dynamics CRM because of its extensibility and tight integration with the .NET environment and well-known tools.
All the marketing sounds great, but I'd like to know about the common DISADVANTAGES and ISSUES concerning this system.
The most important question is how it performs in a company with about 150 concurrent and very active users. I have heard that it's really slow compared to competitors' systems.
The Dynamics CRM Product team has published an excellent whitepaper with guidance and benchmarks for 500 concurrent users. You can learn a lot by studying this paper. The link is here:
Microsoft Dynamics CRM 4.0 Suggested Hardware for Deployments of up to 500 Concurrent Users
I can't answer regarding the number of users or their activity, but I can refer you to the SDK article 'Performance Best Practices'. I'll speak to the side of you that would be writing plugins (on data-access messages), custom pages accessing the CRM web services, and SSRS reports. A couple of points I can relate to:
Disable Plug-ins. This is an attractive and major integration point into CRM, so the fact that they list it as a performance issue is disheartening. We have seen OutOfMemory exceptions stemming from the plugin cache. We got around this issue by deploying to disk rather than to the database: when deployed to the database, CRM reloads the assembly and confirms the signature every time a plugin is called. We believe this was eating up the Large Object Heap. Probably not an issue for your normal CRM implementation.
Limit Data Retrieved. Definitely. Avoid lookups/picklists/bits you don't need when you can, as these cause an extra join. This is not going to be a huge deal on smaller entities, but if you need entities with a large number of attributes, it could be. Probably not an issue for normal CRM customization; a good design in other cases should avoid this issue.
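To make "limit data retrieved" concrete, this is roughly what it looks like with the CRM 2011 SDK's QueryExpression (the attribute names are illustrative):

```csharp
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class AccountQueries
{
    // Ask for exactly the columns the caller needs. new ColumnSet(true)
    // ("all columns") is what drags in the lookups/picklists and their joins.
    public static EntityCollection GetAccountNames(IOrganizationService service)
    {
        var query = new QueryExpression("account")
        {
            ColumnSet = new ColumnSet("name", "accountnumber")
        };
        return service.RetrieveMultiple(query);
    }
}
```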
I can't really offer any advice on how it compares to its leading competitors. I know the main things are that it's cheaper and very actively developed.
I can say a bit about performance, though, which might help.
We have about 400-600 concurrent users on the system. The system isn't particularly web-server intensive. We have two web servers for resilience - it would be a disaster if the system went offline - but these servers are never taxed. They have a couple of virtual cores and 4 GB of RAM each.
Our database is 130 GB in size and is hosted on a 24-core database server with 48 GB of RAM. It is clustered, but because SQL Server can't have two active nodes, only one server is ever active.
The database server really never gets maxed out. However, there is one very important change we needed to make, and one that I think MS is now advising all users of large CRM installs to make. By default, SQL Server uses a locking mode that will block people writing to the database while a row is being read. In our system (and numerous others, apparently) that was causing huge issues.
We switched on a different mode (I think it's called "snapshot isolation", or something like that). To be fair, though, even if you did have 200 concurrent users, it won't be an issue until the more central tables, like activitypointer and account, get pretty large (into the millions of rows).
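If that mode is SQL Server's row-versioning option (READ_COMMITTED_SNAPSHOT), enabling it is a one-off ALTER DATABASE; a sketch, with the connection string and CRM database name as placeholders:

```csharp
using System.Data.SqlClient;

class EnableRowVersioning
{
    static void Main()
    {
        // Placeholder connection string; run this against the database server directly.
        const string connStr = "Server=.;Database=master;Integrated Security=true";

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            // Readers now see the last committed row version instead of taking
            // shared locks, so reads stop blocking writes (and vice versa).
            const string sql = @"ALTER DATABASE [Contoso_MSCRM]
                                 SET READ_COMMITTED_SNAPSHOT ON
                                 WITH ROLLBACK IMMEDIATE";
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```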
So - there is no doubt that CRM 2011 can handle that many users, as long as you have suitable hardware and someone who understands SQL Server.
HTH
S
